26 January 2014

How to clean up your Facebook profile of old posts, messages, pictures, and videos

All of us have posts from the past that we would like to clean up. They may be posts that are now embarrassing, posts with exes that we want off of our profile, or maybe you are interviewing for a job that you know will review your online presence and you want to make sure your profile looks good. Facebook does not make it easy for the average user to see posts from long ago. If you are a geek and know how to use Facebook's FQL, you can query all of your previous posts for specific keywords. Otherwise, there are a couple of options:

  1. You can go into the general account settings and click Download a copy of your Facebook data. This downloads all of your Facebook data, divided into separate files for each section of your profile, including all of your pictures and videos. Once you have the download, open the wall.htm file in Excel; it contains all of your wall posts from the beginning of your profile. Search the spreadsheet for keywords; the results include the exact date/time each post was made. You can then go to your Facebook profile and find that exact post by clicking on the year, then the month, and finally scrolling through all posts in that month. Click on the post's drop-down and select Delete.
  2. There is a Facebook app called Search My Posts. It lets you enter keywords, parses through your profile, and finds all posts containing them. For posts on your own profile, you can click Link to Post to go directly to the post, then click the drop-down and select Delete. The app also lists posts you made on other people's profiles, but without a link to the post. You will have to note the date/time, go to the user's profile, click on the year/month, and parse through all posts for that month until you find yours. At that point, you can delete your post. The same goes for posts you made in groups.
As for recommendations, this applies to both options above: you cannot remove a recommendation from within Facebook. You must go back to the web page where you clicked the Facebook recommendation. To do this, click the link to the Facebook post (option 2) or navigate to the date it was posted (option 1), then follow the link in the post to the page that was recommended. There you can click the Facebook Recommend button again to remove the recommendation. Once you remove it, the post disappears from Facebook.

As far as Facebook messages go, Facebook has historically archived messages rather than deleting them. When you go into your messages box, there is an archive box you can click on. There you will find all of the archived messages from the past, and they can now be deleted. First click on the message, then click Actions and select Delete Conversation. That permanently deletes the conversation from your Facebook profile.

For your pictures and videos, I suggest using the first option, or manually parsing through them on your profile. 

Keeping your Facebook profile safe and out of trouble in civil court, criminal court, work, and home life is wise. I would leave alcohol, guns, sex, violence, and controversial topics off of Facebook. These topics can cause you to lose your job, lose your family, be sued, or incriminate you in criminal court. If you have any of these topics on Facebook, I suggest cleaning them off. I am not a criminal/civil/psychology expert, but common sense will tell you these things, especially in today's digital world. Information is magnified and misconstrued because readers inject their own emotions into the text, not yours, possibly leading to ramifications.

How to search all of your YouTube comments

If you have been trying to find a way to search through all of the YouTube comments you have ever posted, there is a way. YouTube does not seem to have a feature that lets you see all of your comments, but you can use its parent company, Google, to find them. Go to google.com, enter the line below, change <Youtube Account Name> to your YouTube account name, and hit <enter>. All of your YouTube comments will pop up.

site:youtube.com/all_comments "<Youtube Account Name>"

19 January 2014

How to exclude a directory on the destination while using the mirror switch in a Robocopy

I ran into a situation where we had to mirror our local DFS to all of the remote DFS locations. The problem was that one directory in each of the remote shares was unique to that location and had to remain untouched. My first thought was to have PowerShell parse through the base directory trees and perform a recursive robocopy on each base, but there were 177 base directories with thousands of subdirectories under each one. That was just too much. I finally figured out the answer: make sure there is also a directory in the source DFS with the same name as the one in the destination that is to remain unchanged. Once that directory existed in the source, I was able to exclude it from the copy with the /XD switch, and the mirror left the directory in the destination untouched.
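As a sketch, the resulting command looked something like the following. The server, share, and SiteLocal directory names are illustrative, not our actual DFS paths:

```
rem Mirror the hub DFS share to a remote DFS share, leaving the
rem site-specific directory untouched. An empty SiteLocal directory
rem also exists in the source so /MIR does not purge it at the destination.
robocopy \\hq\dfs\Projects \\remote01\dfs\Projects /MIR /XD \\hq\dfs\Projects\SiteLocal /R:1 /W:1 /LOG:C:\Logs\DFSMirror.log
```

Because /XD skips the named directory entirely, the /MIR purge pass never evaluates it, so the destination's copy is left alone.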

16 January 2014

Deploying enormous packages across a WAN

As many of you might know, deploying huge packages through SCCM can be a challenge. The first question you might ask is: what counts as enormous? In my opinion, anything over the default 5 GB client cache size. Working in the architecture/engineering industry, I encounter this issue quite often with Bentley, Autodesk, and Adobe applications. There are several options to address it.

  1. You can increase the cache size to accommodate the package, but when you are talking about packages 40+ GB in size, that is really not an option, especially on SSD drives, where you are sacrificing space for speed. The issue is that you now need double the space on a system to perform the install. Another issue is pushing the package to the distribution points: across the WAN it is going to take a LONG time, even with 10-megabit links. You could export the package from SCCM to a thumb drive and mail it to the locations to update the distribution points. You would also need to resize the cache on all machines before and after the deployment. 
  2. You can create a package in SCCM that points to a network share and runs from it. To use this option, you need a network share in each remote location. This is the option I use for enormous packages. We keep our remote network shares updated via robocopy from our home office share. This not only lets us distribute from SCCM, but also provides a copy of the package in remote locations in case we need to run a repair or install a single copy. 
If you choose option 2, here is a guide to how we implement this.

  1. Create a single application deployment in SCCM
  2. Under Deployment Types, create a deployment type for each location that has an OU. This allows you to make the deployment type specific to that site. I name each deployment type after the office location (i.e. Nashville, Atlanta, etc.)
  3. Under the content tab, leave content location blank.
  4. Under the Programs tab, enter the installation and uninstall programs, making sure you populate the Start In fields beneath them with the location of the install source at the site specified by the OU.
  5. Once you have the other tabs completed, go back to the Requirements tab and click Add. Under Category, select Device; under Condition, select Organizational Unit. Click Add and select Browse to choose the OU for this specific location. 
  6. Repeat steps 2 through 5 until you have created a deployment type for each of your remote locations.
That is all there is to pointing a deployment package at the install source instead of using the distribution points. 

NOTE: For this to work, you will need a network share at each location; otherwise, the install will take place across the WAN. The package must also be robocopied to the remote network shares first. 
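Pre-staging a package to a branch share can itself be scripted. A minimal sketch, with hypothetical server, share, and package names:

```
rem Push the install source from the home office share to a branch share
robocopy \\hq\Packages\Autodesk2014 \\nashville\Packages\Autodesk2014 /MIR /R:1 /W:1 /LOG:C:\Logs\Autodesk2014-Nashville.log
```

Running one such line per location (or looping over a list of site servers) keeps every remote share in sync with the home office copy before the deployment goes out.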

15 January 2014

Adding environment variables across all profiles without a reboot

Machine-level environment variables can be added without having to reboot the machine. Use setx.exe to create the new variable. The format is:

  • setx.exe <environmental variable name> <value> /m 
  • i.e. setx.exe MIG_IGNORE_PROFILE_MISSING 1 /m
To avoid the reboot, open Task Manager and end the process tree of explorer.exe, then start explorer.exe again; the new environment variable is now present. Ending just the explorer.exe process, rather than its process tree, will not make the variable visible.
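The same sequence can be scripted from an elevated command prompt; a sketch using the example variable above:

```
rem Set the machine-level variable (/m), then restart the explorer
rem process tree so new shell windows pick up the change without a reboot
setx.exe MIG_IGNORE_PROFILE_MISSING 1 /m
taskkill /F /IM explorer.exe /T
start explorer.exe
```

The /T switch on taskkill ends the whole process tree, matching the "end process tree" step in Task Manager.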

13 January 2014

Apply Updates and Hotfixes online

This script is intended to apply updates and hotfixes after the OS has been installed and is online. There are Windows updates that cannot be installed offline, as described in this blog. For those updates, I have written a script that applies them using wusa.exe. To install them in the correct order, I recommend numbering the filenames so the script reads them in sequence; I added 01-, 02-, 03-, etc. in front of each filename. The script has been written so that it only processes .msu files. You can download this script here.
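For example, the script's source folder might look like this (the KB numbers are illustrative):

```
01-Windows8-RT-KB1234567-x64.msu
02-Windows8-RT-KB1234568-x64.msu
03-Windows8-RT-KB1234569-x64.msu
```

Note that the filenames keep the standard -KB token, which the script below relies on to derive the short name it prints while installing.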

 #   Author: Mick Pletcher  
 #     Date: 01 January 2014  
 #  Program: Online Windows Updates Installer  
  
 #Declare Local Memory  
 Set-Variable -Name File -Scope Local -Force  
  
 Function DeclareGlobalMemory {  
      Set-Variable -Name Files -Scope Global -Force  
      Set-Variable -Name RelativePath -Scope Global -Force  
 }  
  
 Function GlobalMemoryCleanup {  
      Remove-Variable -Name Files -Scope Global -Force  
      Remove-Variable -Name RelativePath -Scope Global -Force  
 }  
  
 Function RenameWindow ($Title) {  
      #Declare Local Memory  
      Set-Variable -Name a -Scope Local -Force  
      $a = (Get-Host).UI.RawUI  
      $a.WindowTitle = $Title  
      #Cleanup Local Memory  
      Remove-Variable -Name a -Scope Local -Force  
 }  
  
 Function GetRelativePath {  
      $Global:RelativePath = (Split-Path $SCRIPT:MyInvocation.MyCommand.Path -Parent) + "\"  
 }  
  
 Function GetFiles {  
      $Global:Files = Get-ChildItem -Path $Global:RelativePath  
 }  
  
 Function CreateTempFolder ($FolderName) {  
      $FolderName = "c:\temp\" + $FolderName  
      New-Item -Path $FolderName -ItemType Directory -Force  
 }  
  
 Function RemoveTempFolder ($FolderName) {  
      $FolderName = "c:\temp\" + $FolderName  
      Remove-Item -Path $FolderName -Recurse -Force  
 }  
  
 Function ExtractCAB ($Name) {  
      #Declare Local Memory  
      Set-Variable -Name arguments -Scope Local -Force  
      Set-Variable -Name Dest -Scope Local -Force  
      Set-Variable -Name Source -Scope Local -Force  
      $Source = $Global:RelativePath + $Name  
      $Dest = "c:\temp\" + $Name.Substring(0, $Name.Length - 4)  
      $arguments = "-F:*" + [char]32 + $Source + [char]32 + $Dest  
      Start-Process -FilePath "expand.exe" -ArgumentList $arguments -Wait -PassThru  
      #Cleanup Local Memory  
      Remove-Variable -Name arguments -Scope Local -Force  
      Remove-Variable -Name Dest -Scope Local -Force  
      Remove-Variable -Name Source -Scope Local -Force  
 }  
  
 #Function ApplyWindowsUpdate ($Name, $Directory) {  
 #     #Declare Local Memory  
 #     Set-Variable -Name arguments -Scope Local -Force  
 #     $Name = $Name.Substring(0, $Name.Length - 4)  
 #     $Name = $Name + ".cab"  
 #     $arguments = "/Online /Add-Package /PackagePath:" + "c:\temp\" + $Directory + "\" + $Name  
 #     Start-Process -FilePath "DISM.exe" -ArgumentList $arguments -Wait -PassThru  
 #     #Cleanup Local Memory  
 #     Remove-Variable -Name arguments -Scope Local -Force  
 #}  
  
 Function ApplyWindowsUpdate ($Name) {  
      #Declare Local Memory  
      Set-Variable -Name App -Scope Local -Force  
      Set-Variable -Name arguments -Scope Local -Force  
      Set-Variable -Name index -Scope Local -Force  
      Set-Variable -Name Result -Scope Local -Force  
      $App = $Name  
      $index = $App.IndexOf("-KB") + 1  
      $App = $App.Substring(0, $App.Length - 4)  
      $App = $App.Substring($index)  
      Write-Host "Installing"$App"....." -NoNewline  
      $arguments = $Global:RelativePath + $Name + [char]32 + "/quiet /norestart"  
      $Result = (Start-Process -FilePath "wusa.exe" -ArgumentList $arguments -Wait -PassThru).ExitCode  
      #0 = installed, 3010 = installed and a reboot is required  
      If (($Result -eq 0) -or ($Result -eq 3010)) {  
           Write-Host "Succeeded" -ForegroundColor Yellow  
      } else {  
           Write-Host "Failed with error code"$Result -ForegroundColor Red  
      }  
      #Cleanup Local Memory  
      Remove-Variable -Name App -Scope Local -Force  
      Remove-Variable -Name arguments -Scope Local -Force  
      Remove-Variable -Name index -Scope Local -Force  
      Remove-Variable -Name Result -Scope Local -Force  
 }  
  
 RenameWindow "Windows Updates"  
 DeclareGlobalMemory  
 GetRelativePath  
 GetFiles  
 foreach ($File in $Global:Files) {  
      if (($File.Attributes -ne "Directory") -and ($File.Name -like "*.msu")) {  
           ApplyWindowsUpdate $File.Name  
      }  
 }  
 GlobalMemoryCleanup  
 #Cleanup Local Memory  
 Remove-Variable -Name File -Scope Local -Force  

09 January 2014

MDT 2013 and Pentium 4 Systems

My firm is getting in its first tablet machines, which will run Windows 8.1. This requires MDT 2013 and ADK 8.1. The problem is that we still have a few Pentium 4 systems, specifically Dell Precision Workstation 380s, and the Pentium 4 processor is not compatible with Windows 8.1 and the ADK. The system crashes when the Windows 8.1 PE begins to load, with the following message:

Your PC needs to restart.
Please hold down the power button.
Error Code:0x0000005D

The resolution provided by Microsoft is to either upgrade the processor or downgrade to WAIK and MDT 2012.

There is a slight possibility of a resolution that I am going to work on when I get some time: modifying a WIM from MDT 2012 to point to the MDT 2013 server. I do not know if this will work. I did go in and replace the winpe.wim files MDT uses to build the boot WIMs with MDT 2012 winpe.wim files, and MDT 2013 instantly failed. For now, until we get the last of the older machines out of service, the only two resolutions are to run two different MDT servers, one for the old systems and one for the newer, or to create an image for the 380 and use MDT 2013 for everything else.