Friday Fun: A SysInternals PowerShell Workflow

Over the years I’ve come up with a number of PowerShell tools to download the SysInternals tools to my desktop. And yes, I know that with PowerShell 5 and PowerShellGet I could download and install a SysInternals package, but that assumes the package is current, and that’s not really the point. Instead, I want to use today’s Friday Fun to offer you an example of using a workflow as a scripting tool. In this case, the goal is to download the SysInternals files from the Internet.

First, you’ll need to get a copy of the workflow from GitHub.

A workflow command is like a function in that you need to load it into your PowerShell session, for example by dot sourcing the file.

. c:\scripts\Update-SysinternalsWorkflow.ps1

This will give you a new command.


The workflow can now be executed like any other command.

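Something along these lines, assuming the workflow inside the file is named Update-Sysinternals and takes a -Path parameter for the local folder (check Get-Command after dot sourcing to confirm what you actually have):

Get-Command -CommandType Workflow

# the name and parameter below are assumptions - adjust to match the workflow you loaded
Update-Sysinternals -Path C:\Sysinternals -Verbose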

The workflow’s main advantage is that it can process items in parallel and you can throttle the activity. In my workflow, I am processing 8 files at once.
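
Here’s a stripped-down sketch of that construct (the copy step is a placeholder, not the real download logic):

workflow Copy-FilesParallel {
    param([string[]]$Path, [string]$Destination)

    # process up to 8 files at a time
    foreach -parallel -throttlelimit 8 ($file in $Path) {
        Copy-Item -Path $file -Destination $Destination
    }
}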

One thing to be careful of in a workflow is scope. You shouldn’t assume that variables can be accessed across the entire workflow. That’s why I am specifically scoping some variables so that they will persist across sequences.
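
Here’s a small illustration of the problem and the fix (this isn’t from my workflow, just a minimal example). Without the $workflow: scope modifier, the nested block would only update its own local copy of the variable:

workflow Test-WorkflowScope {
    $total = 0

    foreach -parallel ($i in 1..10) {
        # the $workflow: prefix updates the workflow-level variable instead of a local copy
        $workflow:total += $i
    }

    "Total is $total"
}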

I really hope that one day this parallel processing will make its way into the language itself because, frankly, that is the only reason I am using a workflow. And it’s quick. I downloaded the entire directory in a little over a minute on my FiOS connection. The workflow will also only download files that are either newer online or not in the specified directory.

If you are looking to learn more about workflows, there is material in PowerShell in Depth.

I hope you find this useful. Consider it my Valentine to you.

NOTE: Because the script is on GitHub, it will always be the latest version, including what you see embedded in this post. Since this article was posted, I have made a few changes which may not be reflected here.

Scraping Sysinternals

Recently I was conversing with someone about my PowerShell code that downloads tools from the live Sysinternals site. If you search the Internet, you’ll find plenty of ways to achieve the same goal. But we were running into a problem where PowerShell was failing to get information from the site. From my testing and research, I’m guessing there was a timing issue when the site is too busy. So I started playing around with some alternatives.

I knew that I could easily get the HTML content from http://live.sysinternals.com through a few different commands. I chose to use Invoke-RestMethod for the sake of simplicity.
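
The retrieval itself is a one-liner:

# grab the directory listing from the live site
$html = Invoke-RestMethod -Uri 'http://live.sysinternals.com'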

Because $html is one long string with predictable patterns, I realized I could use my script that converts text to objects using named regular expression patterns. In my test script, I dot source that script.

Now for the fun part. I had to build a regular expression pattern, and eventually I arrived at one that does the job.

If you have used the Sysinternals site, you’ll know there is also a Tools “subfolder” that appears to be essentially the same as the top level site. My pattern is for the top level site. Armed with this pattern, it wasn’t difficult to create an array of objects for each tool.

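The exact pattern depends on the markup the site returns, so treat this as a sketch rather than my actual pattern: the idea is a set of named captures for the date, size, and file name, which you can then turn into objects.

# illustrative pattern for listing lines shaped like:
#   <day>, <month> <dd>, <yyyy>  <h:mm> <AM/PM>   <size> <A HREF="/<file>"><file></A><br>
[regex]$rx = '(?<Date>\w+, \w+ \d{1,2}, \d{4}\s+\d{1,2}:\d{2} [AP]M)\s+(?<Size>\d+) <A HREF="/(?<Name>[^"]+)">'

$tools = foreach ($match in $rx.Matches($html)) {
    [pscustomobject]@{
        Name     = $match.Groups['Name'].Value
        Modified = [datetime]$match.Groups['Date'].Value
        Size     = [int64]$match.Groups['Size'].Value
    }
}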

Next, I check my local directory and get the most recent file.
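
That part is plain old PowerShell; a minimal version looks something like this (using C:\Sysinternals as the local folder):

# newest file currently in the local folder
$newest = Get-ChildItem -Path C:\Sysinternals -File |
    Sort-Object -Property LastWriteTime -Descending |
    Select-Object -First 1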

Then I can test for files that don’t exist locally or are newer on the site.

If there are needed files, then I create a System.Net.WebClient object and download the files.
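
Putting those pieces together, a sketch might look like this (it assumes the $tools collection built above and C:\Sysinternals as the local folder):

# files that are missing locally or newer on the site
$needed = $tools | Where-Object {
    $local = Join-Path -Path C:\Sysinternals -ChildPath $_.Name
    (-not (Test-Path -Path $local)) -or ($_.Modified -gt (Get-Item -Path $local).LastWriteTime)
}

if ($needed) {
    $wc = New-Object -TypeName System.Net.WebClient
    foreach ($file in $needed) {
        # download each needed file to the local folder
        $uri  = "http://live.sysinternals.com/$($file.Name)"
        $dest = Join-Path -Path C:\Sysinternals -ChildPath $file.Name
        $wc.DownloadFile($uri, $dest)
    }
}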

The end result is that I can update my local Sysinternals folder very quickly and not have to worry about timing problems with the WebClient service.

Friday Fun: Find File Locking Process with PowerShell

I was asked on Twitter this morning about a way to find out what process has a lock on a given file. I’m not aware of any PowerShell cmdlet that can do that, but I figured there had to be a .NET way, and if I could find a code sample I could put something together in PowerShell. After some research I came to the conclusion that the programmatic approaches were way above my pay grade, but I did come up with another option.

I’m a big believer in using the right tool for the job, and just because you can do something in PowerShell doesn’t mean you should. The best tool for this job is Handle.exe from the Sysinternals suite, which you can download for free. You can specify part of a file name and it will show you the process that has a handle to that file.

[Screenshot: handle.exe output showing the process that has the file open]

But why not have your cake and eat it too? I can take this output and turn it into a PowerShell object. I recently posted a function to convert command-line output to objects using regular expressions and named captures. All I need is a regular expression pattern.

[Screenshot: handle.exe output converted to PowerShell objects]

Or you can create something specific to this task.
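
Here’s the general idea boiled down to a sketch (the hardcoded path and the exact pattern may need tweaking for your copy of Handle.exe):

Function Get-LockingProcess {
    [cmdletbinding()]
    Param(
        [Parameter(Position = 0, Mandatory, HelpMessage = "What is the file name or path to check?")]
        [ValidateNotNullOrEmpty()]
        [string]$Path
    )

    # adjust this path or make sure handle.exe is in your PATH
    $handle = "C:\Sysinternals\handle.exe"

    # illustrative pattern for output lines such as: notepad.exe  pid: 5216  type: File  1A0: C:\work\notes.txt
    [regex]$rx = '(?<Name>\S+)\s+pid:\s+(?<PID>\d+)\s+type:\s+(?<Type>\w+)\s+(?<Handle>[0-9A-Fa-f]+):\s+(?<Path>.*)'

    & $handle $Path | ForEach-Object {
        if ($rx.IsMatch($_)) {
            $match = $rx.Match($_)
            [pscustomobject]@{
                Name   = $match.Groups['Name'].Value
                PID    = $match.Groups['PID'].Value
                Type   = $match.Groups['Type'].Value
                Handle = $match.Groups['Handle'].Value
                Path   = $match.Groups['Path'].Value
            }
        }
    }
}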

This requires Handle.exe. I’ve hardcoded the path, which you’ll need to adjust, or make sure the utility is in your PATH. But now look what I can do!
[Screenshot: Get-LockingProcess output]

I know some people get hung up on external dependencies, but if it gets the job done I don’t see the issue, especially when the solution is free to begin with!

Enjoy your weekend.

More PowerShell Trace Window Fun

On my last Friday Fun, I posted an article about using Internet Explorer as a trace window. The idea was to put debug or trace messages in a separate application. I received a comment on the post suggesting I could do a similar thing with the DebugView utility from Sysinternals. This application is used to capture debug messages, so I thought I’d give it a try.

After you download it, you will need to manually run it once to accept the license terms and set up a filter. I’m assuming you don’t regularly use this program for anything else. Depending on your computer you may not need a filter, but it is the best way to work with my Debug-Message function.

The essence of this function is the [System.Diagnostics.Debug]::WriteLine($Message,$Category) line. The Category will show up as a prefix to the message in the DebugView window. I set a filter in DebugView on that category, i.e. PS Trace*. This function, like my IE trace function, relies on a variable, $TraceEnabled, being set to $True. If the dbgview.exe process isn’t running, the function starts it. I have hardcoded the path to dbgview.exe, which you’ll need to adjust. The easiest approach is to drop dbgview.exe into your Windows folder or modify your %PATH% variable to include your Sysinternals folder.
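
Stripped down, the function looks something like this (a sketch, with an assumed path to dbgview.exe):

Function Debug-Message {
    [cmdletbinding()]
    Param(
        [Parameter(Position = 0, Mandatory)]
        [string]$Message,
        [string]$Category = "PS Trace"
    )

    if ($TraceEnabled) {
        # launch DebugView if it isn't already running; adjust this path for your system
        if (-not (Get-Process -Name dbgview -ErrorAction SilentlyContinue)) {
            Start-Process -FilePath "C:\Sysinternals\dbgview.exe"
            Start-Sleep -Seconds 2
        }
        # the category shows up as a prefix on the message in DebugView
        [System.Diagnostics.Debug]::WriteLine($Message, $Category)
    }
}

Set-Alias -Name Trace -Value Debug-Message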

Using the function is no different than my IE version. In fact, in my demo script all I needed to do was change which script gets dot-sourced.

I added a “Trace” alias to my new DebugView function so I don’t have to change anything else. When I run my script using the -Trace parameter, I get a handy trace window like this.

[Screenshot: trace messages in the DebugView window]

What’s handy about this utility is that it is easy to save the results to a file. I also don’t have to deal with messy COM objects. If you don’t set up the filter ahead of time, you may have a hard time finding the trace messages from your script. But that’s your choice.

What do you think?

Download SysInternals with PowerShell

Like many IT Pros, I’m a big fan of the utilities that make up the Sysinternals suite. A number of years ago, Microsoft created a “live” web directory (http://live.sysinternals.com/tools) that gives you direct access to each utility. While this is very handy, I personally prefer to keep a local version. I used to periodically check the site and update my local folder, but now I can use PowerShell.

My version requires PowerShell 3.0. You need to have the WebClient service running in order to list the Internet files. My script has code to start and stop the service as needed, although if it detects that the service is already running, it won’t stop it afterward, under the assumption that you probably had it running for a reason.
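
Conceptually, the service handling is just a check at the beginning and a matching check at the end:

# remember whether the WebClient service was already running
$svc = Get-Service -Name WebClient
$alreadyRunning = $svc.Status -eq 'Running'

if (-not $alreadyRunning) { Start-Service -Name WebClient }

# ... list and copy the files ...

# only stop the service if this script started it
if (-not $alreadyRunning) { Stop-Service -Name WebClient }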

The logic behind the script is pretty simple: if the date of the online version is newer than the local version, copy the file. One thing I ran into, though, is that there can be a discrepancy between the time stamps due to time zones and/or daylight saving time. I couldn’t find an easy way to take those things into account, so I opted to simply test the date and ignore the time.
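
In other words, the test compares just the date portion of each time stamp ($remote, $local, and $localFolder below are placeholders for the online file, the local file, and the local folder):

# compare dates only, ignoring the time, to sidestep time zone and DST differences
if ($remote.LastWriteTime.Date -gt $local.LastWriteTime.Date) {
    Copy-Item -Path $remote.FullName -Destination $localFolder
}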

The next thing I should do is set up a PowerShell scheduled job to run the script on a monthly basis. I hope you’ll let me know how this works out for you.