In honor of today's festivities, at least in the United States, I thought we'd look at some scary PowerShell. I have seen a lot of scary things in blog posts, tweets, and forum discussions. Often these scary things come from people just getting started with PowerShell who simply haven't learned enough yet to know better, although I have seen some of them from people who appear to be a bit more experienced. I find these things scary because I think they are "bad" examples of how to use PowerShell. Often these examples will work and provide the desired result, but the journey is through a terrifying forest. Please note that PowerShell style is subjective, but I think many of the core concepts I want to show you are sound.
First, here is a chunk of scary PowerShell code. This is something I wrote based on a number of "bad" techniques I've encountered.
$h = "Name,Type,Size,LastModified,Path" $h > c:\work\report.csvc $files = dir c:\work\ -file -Recurse | where {$_.length -gt 1048576} | where {$_.LastWriteTime -gt "1/1/2014"} foreach ($file in $files) { $filetype = $file.name.substring($file.Name.LastIndexOf(".")) $data = $file.name+","+$filetype+","+$file.Length+","+$file.LastWriteTime+","+$file.FullName $data >> c:\work\report.csv } }
This code will create a CSV file listing files in C:\Work that are over a certain size and modified after January 1, 2014. When I see code like this, I am pretty confident the author is coming from a VBScript background. I see a lot of energy spent working with text. I'll come back to that point in a moment.
First, the filtering is a bit ugly. Unless you have a specific byte value in mind, use shortcuts like 1MB, which is easier to understand than 1048576. You can also combine the filters into one.
$files = dir c:\work\ -file -Recurse | where {$_.length -gt 1MB -AND $_.LastWriteTime -gt "1/1/2014"}
I cringe when I see people trying to concatenate values in PowerShell. In this case it simply isn't needed. But for the sake of learning, if you really needed to build a string, take advantage of variable expansion. Here I will need to use a sub-expression.
$data = "$($file.name),$($filetype),$($file.Length),$($file.LastWriteTime),$($file.FullName)"
But in the example of bad code, instead of manually creating a CSV file, PowerShell can do that for you with Export-CSV. In the code sample the header names differ from the property names, but that's OK. We can use a hashtable with Select-Object and create something new.
dir c:\work\ -file -Recurse |
where {$_.length -gt 1MB -AND $_.LastWriteTime -gt "1/1/2014"} |
Select-Object Name,Extension,@{Name="Size";Expression={$_.length}},
@{Name="LastModified";Expression={$_.Lastwritetime}},
@{Name="Path";Expression={$_.fullname}} |
Export-Csv -Path C:\work\report2.csv -NoTypeInformation
The original code used text parsing to get the file extension. But if you pipe a file object to Get-Member, you would discover there is a property called Extension. When building scripts, pipe objects to Get-Member or Select-Object with all properties to discover what you have to work with. The original code is written with the assumption that the CSV file will be used outside of PowerShell. If that is the case, then export it without type information. But if you think you will import the data back into PowerShell, include the type information because it will help PowerShell reconstruct the objects. You would learn all of this by looking at help and examples for Export-CSV.
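If you want to explore for yourself, pipe a file object to Get-Member. Here's a quick sketch; c:\work is just the sample path used throughout this article:
#list the properties a file object exposes
dir c:\work -file | Get-Member -MemberType Property
#or look at every property and its value for a single file
dir c:\work -file | Select-Object -First 1 -Property *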
The Export-CSV pipeline above is technically a one-line command. But it doesn't have to be. It might make more sense to break things up into discrete steps.
$files = dir c:\work\ -file -Recurse |
where {$_.length -gt 1MB -AND $_.LastWriteTime -gt "1/1/2014"}
$data = foreach ($file in $files) {
$file | Select-Object -property Name,Extension,
@{Name="Size";Expression={$_.length}},
@{Name="LastModified";Expression={$_.Lastwritetime}},
@{Name="Path";Expression={$_.fullname}}
}
$data | Export-Csv -Path C:\work\report2.csv -NoTypeInformation
This code is sort of a compromise and isn't too difficult to follow, even if you are new to PowerShell. This would give you the same result.
$files = dir c:\work\ -file -Recurse |
where {$_.length -gt 1MB -AND $_.LastWriteTime -gt "1/1/2014"}
$data = $files | Select -property Name,Extension,
@{Name="Size";Expression={$_.length}},
@{Name="LastModified";Expression={$_.Lastwritetime}},
@{Name="Path";Expression={$_.fullname}}
$data | Export-Csv -Path C:\work\report2a.csv -NoTypeInformation
Sometimes using the ForEach enumerator is faster than using ForEach-Object in a pipeline. You have to test with Measure-Command. If you are running PowerShell v4, you can take advantage of the new Where() method which can dramatically improve performance.
$files = (dir c:\work\ -file -Recurse).where({$_.length -gt 1MB -AND $_.LastWriteTime -gt "1/1/2014"})
$files | Select Name,Extension,@{Name="Size";Expression={$_.length}},
@{Name="LastModified";Expression={$_.Lastwritetime}},
@{Name="Path";Expression={$_.fullname}} |
Export-Csv -Path C:\work\report3.csv -NoTypeInformation
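If you want to see the difference for yourself, wrap each approach in Measure-Command. This is a rough sketch using the same sample folder; your actual timings will vary from system to system:
#time the pipeline version
Measure-Command {
  dir c:\work\ -file -Recurse | where {$_.length -gt 1MB -AND $_.LastWriteTime -gt "1/1/2014"}
}
#time the v4 Where() method version
Measure-Command {
  (dir c:\work\ -file -Recurse).where({$_.length -gt 1MB -AND $_.LastWriteTime -gt "1/1/2014"})
}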
By now I hope you can see how this code is taking advantage of cmdlets and the pipeline. For example, Export-CSV has a -NoClobber parameter which prevents you from overwriting an existing file. You can't do that with the legacy redirection operators > and >> without additional commands.
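For example, assuming report2.csv already exists from the earlier command, adding -NoClobber makes the export fail with an error instead of silently replacing the file:
#-NoClobber throws an error if the CSV file already exists
$data | Export-Csv -Path C:\work\report2.csv -NoTypeInformation -NoClobber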
The final step with my scary code, which I hope isn't quite so scary now, is to turn this into something re-usable. If you put the above into a script as-is, you would have to edit the file every time you wanted to check a different folder or export to a different file. This is where we can turn it from a pumpkin into a fantastic carriage that won't revert at midnight.
Function Get-MyFiles {
    [cmdletbinding()]
    Param(
        [ValidateScript({Test-Path $_})]
        [string]$Path = ".",
        [ValidateNotNullorEmpty()]
        [scriptblock]$Filter = {$_}
    )
    Write-Verbose "Processing $path"
    #get files
    $files = dir -Path $path -Recurse
    #filter if necessary
    $files | where -FilterScript $filter |
        Select-Object Name,Extension,@{Name="Size";Expression={$_.length}},
        @{Name="LastModified";Expression={$_.Lastwritetime}},
        @{Name="Path";Expression={$_.fullname}}
} #end function
This function gives me a flexible tool. All I need to do is specify a path, although it defaults to the current directory, and some sort of filtering script block. The default is to display everything. Now I can run a command like this:
get-myfiles d:\data -Filter {$_.length -gt 5mb}
You'll notice that my function doesn't export anything to a CSV file. Of course not. The function's only purpose is to get files and display a subset of properties. Because the function writes to the pipeline, I can do whatever I need. Perhaps today I need to export to a CSV file, but tomorrow I want to create a formatted table saved to a text file.
get-myfiles D:\VM\ -filter {$_.extension -match "vhd" -AND $_.lastwritetime -le "10/1/2014"} | export-csv c:\reports\oldvhd.csv
get-myfiles c:\work -Verbose -Filter {$_.length -gt 5mb} | format-table | out-file c:\reports\WorkFiles.txt -encoding ascii
I hope PowerShell in general doesn't frighten you. As the saying goes, we fear that which we don't understand. But once you understand some basic PowerShell principles and concepts, I think you'll find it not quite so terrifying.
What scary PowerShell have you come across?