Like many IT Pros, I'm a big fan of the utilities that make up the Sysinternals suite. A number of years ago, Microsoft created a "live" web directory (http://live.sysinternals.com/tools) that gives you direct access to each utility. While this is very handy, I personally prefer to keep a local copy. I used to periodically check the site and update my local folder by hand, but now I can use PowerShell.
#requires -version 3.0

<#
Download Sysinternals tools from the web to a local folder.

****************************************************************
* DO NOT USE IN A PRODUCTION ENVIRONMENT UNTIL YOU HAVE TESTED *
* THOROUGHLY IN A LAB ENVIRONMENT. USE AT YOUR OWN RISK. IF    *
* YOU DO NOT UNDERSTAND WHAT THIS SCRIPT DOES OR HOW IT WORKS, *
* DO NOT USE IT OUTSIDE OF A SECURE, TEST SETTING.             *
****************************************************************
#>

[cmdletbinding(SupportsShouldProcess)]
Param(
    [Parameter(Position = 0)]
    [ValidateScript({ Test-Path -Path $_ })]
    [string]$Destination = "G:\SysInternals"
)

#start the WebClient service if it is not running
if ((Get-Service WebClient).Status -eq 'Stopped') {
    Write-Verbose "Starting WebClient"
    #always start the WebClient service, even when using -WhatIf
    Start-Service WebClient -WhatIf:$false
    $Stopped = $True
}
else {
    <#
    Define a variable to indicate the service was already running
    so that we don't stop it later. Making an assumption that the
    service is already running for a reason.
    #>
    $Stopped = $False
}

#get current files in the destination
$current = dir -Path $Destination -File

Write-Host "Updating Sysinternals tools from \\live.sysinternals.com\tools to $Destination" -ForegroundColor Cyan

foreach ($file in $current) {
    #construct a path to the live web version and compare dates
    $online = Join-Path -Path \\live.sysinternals.com\tools -ChildPath $file.Name
    Write-Verbose "Testing $online"
    if ((Get-Item -Path $online).LastWriteTime.Date -gt $file.LastWriteTime.Date) {
        Copy-Item $online -Destination $Destination -PassThru
    }
}

Write-Host "Testing for online files not in $Destination" -ForegroundColor Green

#test for files online but not in the destination and copy them
dir -Path \\live.sysinternals.com\tools -File |
    Where-Object { $current.Name -notcontains $_.Name } |
    Copy-Item -Destination $Destination -PassThru

<#
alternative, but this might still copy files that haven't really changed:
Robocopy \\live.sysinternals.com\tools $Destination /MIR
#>

if ($Stopped) {
    Write-Verbose "Stopping WebClient"
    #always stop the service, even when using -WhatIf
    Stop-Service WebClient -WhatIf:$False
}

Write-Host "Sysinternals Update Complete" -ForegroundColor Cyan

#end of script
My version requires PowerShell 3.0. You need to have the WebClient service running in order to list the Internet files, so my script has code to start and stop the service as needed. If it detects that the service is already running, however, the script won't stop it, on the assumption that you probably had it running for a reason.
The logic behind the script is pretty simple: if the date of the online version is newer than the local version, copy the file. One thing I ran into, though, is that there can be a discrepancy in the time stamps due to time zones and/or daylight saving time. I couldn't find an easy way to take those things into account, so I opted to simply compare the date and ignore the time.
The next thing I should do is set up a PowerShell scheduled job to run the script on a monthly basis. I hope you'll let me know how this works out for you.
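The scheduled-job idea can be sketched with the PSScheduledJob module that ships with PowerShell 3.0. The script path and trigger settings below are assumptions, so adjust to taste; note that New-JobTrigger has no monthly option, so a weekly trigger is the closest built-in fit (or use the full Task Scheduler for a true monthly run):

```powershell
Import-Module PSScheduledJob

#assumed path to a saved copy of the update script
$script = "C:\Scripts\Update-Sysinternals.ps1"

#New-JobTrigger supports Once/Daily/Weekly/AtStartup/AtLogOn,
#so run weekly as an approximation of a periodic refresh
$trigger = New-JobTrigger -Weekly -DaysOfWeek Monday -At 6:00AM

Register-ScheduledJob -Name "SysinternalsUpdate" -FilePath $script -Trigger $trigger
```

Afterwards, Get-ScheduledJob shows the registration and Unregister-ScheduledJob removes it.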
An alternative is to use Chocolatey: http://chocolatey.org/packages/sysinternals. Thanks to Stefan Stranger (@sstranger) for pointing this out.
If you don’t have the WebClient service installed (such as on Windows Server 2012), you need to install the Desktop Experience feature. You also need to have the destination folder created in advance.
Yes on all counts. I never think about running the tools on a server so I didn’t think about that. And the script does in fact test for the existence of the folder. Of course, you could easily modify the script to create it for you should you so wish.
For a long time I’ve just been using “robocopy \\live.sysinternals.com\tools $dest /XO /S /Z”, but I’ve found that many times robocopy can’t find that UNC path…
Robocopy also needs the Webclient service running. If it fails or stops that would explain your problem.
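One way to guard against that is to make sure WebClient is up before calling robocopy. This is a sketch using the switches from the comment above; $dest is assumed to hold your destination folder:

```powershell
$dest = "G:\SysInternals"   #assumed destination folder

#robocopy over the WebDAV UNC path fails if WebClient isn't running
if ((Get-Service WebClient).Status -ne 'Running') {
    Start-Service WebClient
}

robocopy \\live.sysinternals.com\tools $dest /XO /S /Z
```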
I like the idea of doing the update with PowerShell. It’s also a good exercise! I used the following:
wget.exe -N -nd -np -l 1 -r -w 3 http://live.sysinternals.com/
I’m giving this PowerShell thing a try.
Certainly there are a number of ways to download items from the Internet. I’m not sure in this particular scenario if one technique is better than any others.
Thanks for sharing great stuff.
Could you have the script update the OS Path environment variable, so that when you type a tool name the system finds it automatically instead of you having to remember the full path?
I don’t have anything specifically for that. If you want a permanent change, you could use PowerShell to modify the path settings in the registry. If you need help with that, I recommend the forums at PowerShell.org. A much better place to work on a problem than via blog comments.
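For what it's worth, a persistent, user-level PATH change could be sketched like this; [Environment]::SetEnvironmentVariable with the "User" scope writes through to the registry. The folder path is an assumption:

```powershell
$toolDir = "G:\SysInternals"   #assumed local tools folder

#read the persisted user PATH and append the folder if it's missing
$userPath = [Environment]::GetEnvironmentVariable("Path", "User")
if ($userPath -notlike "*$toolDir*") {
    [Environment]::SetEnvironmentVariable("Path", "$userPath;$toolDir", "User")
}
```

New processes pick up the change; an already-open session would also need `$env:Path += ";$toolDir"` to see it immediately.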