Friday Fun: A SysInternals PowerShell Workflow

Over the years I’ve come up with a number of PowerShell tools to download the SysInternals tools to my desktop. And yes, I know that with PowerShell 5 and PowerShellGet I could download and install a SysInternals package, but that assumes the package is current. That’s not really the point, though. Instead, I want to use today’s Friday Fun to offer you an example of using a workflow as a scripting tool. In this case, the goal is to download the SysInternals files from the Internet.

First, you’ll need to get a copy of the workflow from GitHub.

A workflow command is like a function in that you need to load it into your PowerShell session, such as by dot-sourcing the file.

. c:\scripts\Update-SysinternalsWorkflow.ps1

This will give you a new command.


The workflow can now be executed like any other command.


The workflow’s main advantage is that it can process items in parallel and you can throttle the activity. In my workflow, I am processing 8 files at once.
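If you haven’t seen the syntax before, here is a simplified sketch of a throttled parallel loop in a workflow. This is not the actual Update-SysinternalsWorkflow code; the workflow name, parameters, and URL are placeholders.

```powershell
# Simplified sketch of throttled parallel downloads in a workflow.
# Not the real Update-SysinternalsWorkflow; names and URL are placeholders.
workflow Get-DemoFile {
    param([string[]]$FileName, [string]$Destination = 'C:\Sysinternals')

    # process up to 8 files at once
    foreach -parallel -throttlelimit 8 ($file in $FileName) {
        InlineScript {
            # $using: brings workflow variables into the InlineScript
            $uri = "https://live.sysinternals.com/$using:file"
            Invoke-WebRequest -Uri $uri -OutFile (Join-Path $using:Destination $using:file)
        }
    }
}
```

Without -throttlelimit, foreach -parallel would try to process everything at once, which can overwhelm the remote server or your own bandwidth.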

One thing to be careful of in a workflow is scope.  You shouldn’t assume that variables can be accessed across the entire workflow. That’s why I am specifically scoping some variables so that they will persist across sequences.
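As a minimal illustration of what I mean (this is not code from the actual script), the $workflow: scope modifier makes a variable visible across sequences:

```powershell
# Minimal illustration of workflow variable scope
workflow Test-WFScope {
    sequence {
        # without the workflow: modifier, this variable would not
        # be visible outside this sequence
        $workflow:path = 'C:\Sysinternals'
    }
    sequence {
        # the value persists into later sequences
        "Saving files to $workflow:path"
    }
}
```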

I really hope that one day parallel processing will make its way into the language itself because, frankly, that is the only reason I am using a workflow. And it’s quick: I downloaded the entire directory in a little over a minute on my FiOS connection. The workflow will also only download files that are either newer online or not already in the specified directory.

If you are looking to learn more about workflows, there is material in PowerShell in Depth.

I hope you find this useful. Consider it my Valentine to you.

NOTE: Because the script is on GitHub, it will always be the latest version, including what you see embedded in this post. Since this article was posted I have made a few changes which may not always be reflected in this article.

More Fun Getting PowerShell User Groups

A few days ago I posted a PowerShell function to retrieve information about PowerShell user groups. That function returned basic group information like this.

Each group on the site has its own page, which is what the Link property is for. So it didn’t take much work to use the same techniques as my original post to scrape information from that page. Again, I needed to analyze the source code to determine what classes and properties to use. But the final function isn’t that much different from the first one.

Now I can get the group detail directly from PowerShell.

If you have both commands, you can even combine them.

This isn’t too bad.

You could use PowerShell to get details for every single group, but that can be time consuming since processing is done sequentially. One way you might improve performance is by taking advantage of the parallel foreach feature in a PowerShell workflow. I wrote another function, really more as a proof of concept, that defines a nested workflow. Within this workflow, it processes a collection of links in parallel in batches of 8.

Because workflows are intended to run in isolation, I had to incorporate code from Get-PSUserGroupDetail instead of trying to call it directly. Here’s the complete function.
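The original listing isn’t reproduced here, but a skeleton of the approach might look like this. The outer function name, the workflow name, and the web request standing in for the scraping logic are all my placeholders, not the real code.

```powershell
# Proof-of-concept skeleton only; names are placeholders.
Function Get-PSUserGroupDetailFast {
    param([string[]]$Link)

    # a workflow nested inside the function
    workflow GetGroupDetailWF {
        param([string[]]$Link)
        # process the links in parallel, 8 at a time
        foreach -parallel -throttlelimit 8 ($uri in $Link) {
            InlineScript {
                # the scraping code from Get-PSUserGroupDetail would be
                # incorporated here, since the workflow runs in isolation
                # and can't easily call the function directly
                Invoke-WebRequest -Uri $using:uri -UseBasicParsing
            }
        }
    }

    GetGroupDetailWF -Link $Link
}
```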

But even with parallel processing, this is still not a speedy process. Running the command on my Windows 8.1 box with 8GB of RAM and a very fast FiOS connection still took about 2 minutes to complete. But if you don’t mind waiting, here’s what you can expect.

I will say that having all of this information is fun to play with.

Or you could do something like this.

My last function on the topic is called Show-PSUserGroup. The central command runs my original Get-PSUserGroup function and pipes the results to Out-GridView. From there you can select one or more groups, and each group’s link will open in your browser.
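The core of that idea fits on a few lines. This sketch assumes you have my Get-PSUserGroup function loaded and that its output includes the Link property described earlier.

```powershell
# Minimal sketch of the Show-PSUserGroup idea: pick groups in a grid,
# then open each selected group's page in the default browser.
Get-PSUserGroup |
    Out-GridView -Title 'Select one or more groups' -PassThru |
    ForEach-Object { Start-Process $_.Link }
```

The -PassThru parameter is what turns Out-GridView from a display tool into a selection tool: only the rows you highlight before clicking OK continue down the pipeline.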

Clicking OK opens each link in my browser.

If you’ve collected all of my functions, I recommend creating a module file. I have all of them in a module file called PSUsergroups.psm1. All you need at the end is an export command.
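The export line at the end of the .psm1 might look like this. Get-PSUserGroupDetail is the detail function discussed above; treat the exact list of names as an assumption based on these posts.

```powershell
# last line of PSUsergroups.psm1: expose the public functions
Export-ModuleMember -Function Get-PSUserGroup, Get-PSUserGroupDetail, Show-PSUserGroup
```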

Save the file in the necessary module location and your commands are ready when you are.

NOTE: If you run a PowerShell User Group and you are not registered on this site, I strongly encourage you to do so. Otherwise you are making it very hard for people to find you.

Runspaces, Remoting and Workflow, Oh My!

The other day on Twitter I saw a message about a new script in the Microsoft Script Center for getting remote event logs with WMI. So I took a look at the script. If you take a minute to look at it, you’ll quickly realize this is not a script for beginners. My initial thought was “Why?” We already have cmdlets in PowerShell for querying event logs on remote computers. I realize the script was trying to avoid some of the issues we run into with WMI, and I can’t question the effectiveness of the author’s function. It works, and without a doubt using runspaces like this is faster than cmdlets.

My concern when I see scripts like this is that someone new to PowerShell will see it and run to the hills thinking they’ll never be able to use PowerShell and that is definitely not the case. So I decided to see what I could come up with that used a more IT Pro friendly cmdlet-based approach. I wanted to write something that most of you could have come up with.

My first attempt is a function that uses Invoke-Command to run the Get-EventLog cmdlet in a remote session. In the function I define a scriptblock that gets all the event logs with records, and then gets all non-Information or SuccessAudit events from those logs that have happened since midnight yesterday. My function supports credentials and takes advantage of a few other features of Invoke-Command.
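A sketch of that approach looks something like this. The function and parameter names are mine, not necessarily those in the original script.

```powershell
# Sketch of the Invoke-Command approach; names are placeholders.
Function Get-RemoteEventData {
    param(
        [string[]]$Computername,
        [PSCredential]$Credential
    )

    $sb = {
        # only logs that actually contain records
        $logs = Get-EventLog -List | Where-Object { $_.Entries.Count -gt 0 }
        # midnight yesterday
        $after = (Get-Date).Date.AddDays(-1)
        foreach ($log in $logs) {
            # everything except Information and SuccessAudit entries
            Get-EventLog -LogName $log.Log -After $after |
                Where-Object { $_.EntryType -notmatch 'Information|SuccessAudit' }
        }
    }

    # splat so -Credential is only passed when supplied
    $params = @{ Computername = $Computername; ScriptBlock = $sb }
    if ($Credential) { $params.Credential = $Credential }
    Invoke-Command @params
}
```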

The function works and took about a minute and a half to query 8 machines in my virtual test environment. There is definitely some overhead when using Invoke-Command, but the trade-off is a script that is a little easier to develop and maintain.

Then I thought, what about a workflow? I’m querying event logs but there’s no reason I can’t query all of them simultaneously. Here’s my workflow that does essentially the same thing as my function.
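A comparable workflow sketch is below; again, the name is a placeholder. Note that remoting, credentials, and jobs come for free as the workflow common parameters (-PSComputerName, -PSCredential, -AsJob).

```powershell
# Workflow sketch doing essentially the same query; name is a placeholder.
workflow Get-EventData {
    $logs = Get-EventLog -List | Where-Object { $_.Entries.Count -gt 0 }
    $after = (Get-Date).Date.AddDays(-1)

    # query every log at the same time
    foreach -parallel ($log in $logs) {
        Get-EventLog -LogName $log.Log -After $after |
            Where-Object { $_.EntryType -notmatch 'Information|SuccessAudit' }
    }
}

# e.g. Get-EventData -PSComputerName server1,server2 -PSCredential $cred
```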

Interestingly, in my tests the workflow took about the same amount of time to run. But it is a shorter script to develop, because features like remoting, credentials, and jobs are automatically part of the workflow. There is a potential downside in that all the remote machines must be running PowerShell 3.0. This workflow is also a good example of how workflows aren’t always the answer: there’s nothing here, other than potentially the use of parallelism, that makes it a better choice than my function.

My last concern with the gallery script, and I don’t know if this has an effect on its performance, is that all the event logs are rolled up in a property for each computer. This means you have to take further steps to expand and format the results. My function and workflow, because they rely on Get-EventLog, produce results that are formatted and ready to go.

What I haven’t tried yet, is how this same task can be done with Get-WinEvent or Get-CimInstance. The latter helps avoid some of the issues with WMI and might perform better. If I have time, I’ll get back to you with my results. But in the meantime, what do you think about all of this?

Using runspaces in PowerShell has its place, but it definitely requires an advanced skill set and a willingness to trade a script that is more complicated to develop and maintain (especially if someone else has to) for improved performance. I can see where a runspace approach makes sense when you have thousands of computers. But I might also argue that if that is your environment, you probably have full-blown management suites. Yes, I know there will always be exceptions. But for the majority of you: are you happy writing scripts that use existing cmdlets, or do you feel obligated to learn .NET before you can use PowerShell?

I tried this using Get-CimInstance. Yes, this requires PowerShell 3 on the remote machines (unless you take an extra step to set up a DCOM CIM session) and the use of the WSMan protocol, but it performs surprisingly well and only takes a few lines of PowerShell.
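Here is a sketch of what the CIM-based query might look like. The computer names are placeholders, and the EventType filter is my approximation of the earlier "non-Information, non-SuccessAudit" criteria.

```powershell
# Sketch of the CIM approach: query Win32_NTLogEvent over WSMan.
# Computer names are placeholders; adjust the filter to taste.

# WQL compares dates in DMTF format
$after = [Management.ManagementDateTimeConverter]::ToDmtfDateTime((Get-Date).Date.AddDays(-1))

# EventType: 1 = Error, 2 = Warning, 5 = FailureAudit
$filter = "(EventType = 1 OR EventType = 2 OR EventType = 5) AND TimeGenerated >= '$after'"

Get-CimInstance -ClassName Win32_NTLogEvent -Filter $filter -ComputerName server1,server2
```

Filtering server-side in WQL, rather than piping everything to Where-Object, is a big part of why this approach performs so well.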

I ran this in my test environment and it took about 30 seconds to return 188 event log entries. Because we’re using WSMan, we avoid some of the issues with RPC and DCOM. So here is a solution, at least in this case, that is as fast as using runspaces but only took a few minutes to write and is easy to follow.

Everything we do as IT Pros is a matter of balancing trade-offs and working within a set of limitations.

Techmentor Las Vegas 2013 Session Materials

I had a terrific time at Techmentor last week in Las Vegas. I did two 3-hour sessions. The longer sessions are intended to allow speakers time to go deeper into content and offer more detailed coverage than what you might get at a conference like TechEd. From my informal survey of attendees, many people enjoyed the longer sessions, with a few wanting even longer. I expect we’ll see this longer format at the Techmentor conference next year.

Because I had 3 hours I was able to cover a lot of material. One of my sessions was a hands-on-lab with some exercises. As promised I’ve updated my slide decks (primarily for clarity) and assembled all of my PowerShell scripts and demos. While anyone is welcome to download them, unless you were in my session you won’t have the necessary context. For example, most of my presentations were live demonstrations with the slides serving as notes and an agenda. But feel free to download and try it all out.

Much of the content for my sessions is drawn from my books.

Each zip file contains a PDF of my slide deck and all of my PowerShell samples. Put all the samples in the same directory. All samples are provided as learning material and are not intended for immediate production use.

Techmentor Las Vegas 2013 PowerShell Workflows
Techmentor Las Vegas 2013 Automating AD with PowerShell

PowerShell Workflow Bug

There is a bug with workflows in PowerShell v3 that you might run into. I kept banging into it until I tracked it down. The problem occurs if you try to run a workflow from a PSDrive that you have added. For example, I have a PSDrive (S:) which is mapped to C:\Scripts. When I try to run a workflow from this drive, I get an error.
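If you want to reproduce the conditions, the setup is just a custom filesystem PSDrive (get-foo stands in for any workflow):

```powershell
# reproducing the conditions for the bug: a user-added PSDrive
New-PSDrive -Name S -PSProvider FileSystem -Root C:\Scripts
Set-Location S:\
# running any workflow from S:\ now triggers the error below
```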

PS S:\> get-foo
Cannot perform operation because operation "ResetRunspaceState" is not valid. Remove operation "ResetRunspaceState",
or investigate why it is not valid.
+ CategoryInfo : InvalidOperation: (:) [], ParentContainsErrorRecordException
+ FullyQualifiedErrorId : InvalidOperation
+ PSComputerName : [localhost]

The workaround is to simply change to your C: drive, or any PSDrive that PowerShell loads by default. Then the workflow will run.

PS C:\> get-foo

This has been filed on Connect, but I thought I’d share the workaround so you don’t have to bang your head. And if you are curious about workflows, there will be a good chapter in the forthcoming PowerShell in Depth book.