Getting Top Level Folder Report in PowerShell

One of the sessions I presented recently at TechDays San Francisco was on file share management with PowerShell. One of the scripts I demonstrated was a function to get information about top-level folders. This is the type of thing that could be handy to run, say, against the root of your shared users folder, or against the root of a group share where each subfolder is a share that belongs to a different group. My function takes advantage of a new feature for Get-ChildItem that makes it much easier to retrieve only files or only directories. Here’s my Get-FolderSize function.
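
The full listing from the session isn’t reproduced here; the following is a minimal sketch pieced together from the description below, so treat the parameter names, property names, and error-handling details as my assumptions rather than the original code.

Function Get-FolderSize {
    [cmdletbinding()]
    Param(
        [Parameter(Position=0)]
        [ValidateScript({Test-Path $_})]
        [string]$Path="."
    )

    #get only the top-level folders using the v3 -Directory parameter
    $folders = Get-ChildItem -Path $Path -Directory

    $folders | ForEach-Object -Begin {
        #measure the files found directly in the root of the parent path
        $stat = Get-ChildItem -Path $Path -File | Measure-Object -Property Length -Sum
        [pscustomobject]@{
            Path       = $Path
            TotalFiles = $stat.Count
            TotalSize  = $stat.Sum
        }
        $i = 0
    } -Process {
        $i++
        Write-Progress -Activity "Get-FolderSize" -Status $_.FullName -PercentComplete (($i/$folders.Count)*100)
        Try {
            #splat the Get-ChildItem parameters
            $gciParams = @{
                Path          = $_.FullName
                File          = $True
                Recurse       = $True
                ErrorAction   = "Stop"
                ErrorVariable = "ev"
            }
            $stat = Get-ChildItem @gciParams | Measure-Object -Property Length -Sum
            [pscustomobject]@{
                Path       = $_.FullName
                TotalFiles = $stat.Count
                TotalSize  = $stat.Sum
            }
        }
        Catch {
            #display any errors and the paths that caused them
            foreach ($e in $ev) {
                Write-Warning ("{0} : {1}" -f $e.TargetObject, $e.Exception.Message)
            }
        }
    } -End {
        #complete the progress bar
        Write-Progress -Activity "Get-FolderSize" -Completed
    }
}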

The function defaults to the local path and gets a collection of all of the top level folders, that is, those found directly in the root. The function then takes the collection of folders and pipes them to ForEach-Object. Most of the time we only use the Process scriptblock with ForEach-Object, but I want to take advantage of the Begin and End blocks as well. In the Begin scriptblock I measure all of the files in the root of the parent path and create a custom object that shows the number of files and total size in bytes. I’m going to get this same information for each child folder as well.

The Process scriptblock does just that for each top-level folder. This version of my function uses Write-Progress to display progress, and in the End scriptblock I have code to complete the progress bar, although it works just fine without it.

Other techniques I’d like to point out are the use of splatting and error handling. You’ll notice that I’m using the common -ErrorVariable parameter. After exploring the different types of exceptions, I decided I could easily display any errors and the paths in the Catch block. I’m using Write-Warning, but this could just as easily be written to a text file.

The function writes an object like this for every folder.

Here’s an example of complete output:

Because I’ve written objects to the pipeline, I could pipe this to Out-GridView, export to a CSV file, or create an HTML report.
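
For example, any of these would work; the paths and file names here are just placeholders:

Get-FolderSize -Path \\server01\users | Out-GridView -Title "User Folders"
Get-FolderSize -Path D:\Groups | Export-Csv -Path C:\Work\GroupShares.csv -NoTypeInformation
Get-FolderSize -Path D:\Groups | ConvertTo-Html -Title "Group Share Report" | Out-File C:\Work\GroupShares.htm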

This is just a taste of what you can accomplish with some basic PowerShell commands.

 

Friday Fun: Get Latest PowerShell Scripts

Probably like many of you, I keep almost all of my scripts in a single location. I’m also usually working on multiple items at the same time. Sometimes I have difficulty remembering the name of a script I was working on a few days ago that I need to return to. The concept is simple enough: search my script directory for PowerShell files sorted by the last write time and look for the file I need.

That’s a lot to type, so why not build a function to do the work for me? In fact, since I spend a lot of time in the PowerShell ISE, why not produce graphical output? The easiest way is to pipe results to Out-GridView. So after a little tinkering, I came up with Get-LatestScript.


Function Get-LatestScript {
    [cmdletbinding()]
    Param(
        [Parameter(Position=0)]
        [ValidateScript({Test-Path $_})]
        [string]$Path=$global:ScriptPath,
        [Parameter(Position=1)]
        [ValidateScript({$_ -ge 1})]
        [int]$Newest=10
    )

    if (-Not $Path) {
        $Path=(Get-Location).Path
    }

    #define a list of file extensions
    $include="*.ps1","*.psd1","*.psm1","*.ps1xml"

    Write-Verbose ("Getting {0} PowerShell files from {1}" -f $Newest,$Path)

    #construct a title for Out-GridView
    $Title=("Recent PowerShell Files in {0}" -f $Path.ToUpper())

    Get-ChildItem -Path $Path -Include $include -Recurse |
        Sort-Object -Property LastWriteTime -Descending |
        Select-Object -First $Newest -Property LastWriteTime,CreationTime,
            @{Name="Size";Expression={$_.Length}},
            @{Name="Lines";Expression={(Get-Content $_.FullName | Measure-Object -Line).Lines}},
            Directory,Name,FullName |
        Out-GridView -Title $Title
}

I decided to try something different with the Path variable. I set the default to a global variable, ScriptPath. The idea is that in your PowerShell profile, you’ll have a line like this:


$ScriptPath="C:\Scripts"

If the function finds this variable, it will use it. Otherwise it will use the current location. Notice I’m also using a validation attribute to verify the path. By default the function returns the 10 newest PowerShell files, based on the last write time. The number of files can be controlled by the -Newest parameter.
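
For example, either of these should work (the alternate folder here is just a placeholder):

#use the $ScriptPath default and the 10 newest files
Get-LatestScript

#or point it at another folder and change the count
Get-LatestScript -Path C:\Transfer -Newest 5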

At the heart of the script is a one-line expression to find all matching files in the script folder and its subfolders.


Get-ChildItem -Path $Path -Include $include -Recurse |
    Sort-Object -Property LastWriteTime -Descending

These files are sorted on LastWriteTime in descending order, and then I select only the first $Newest files.


| Sort-Object -Property LastWriteTime -Descending |
    Select-Object -First $Newest ...

I am only interested in a few file properties, so I select them. I also add custom properties to capture the file size and the number of lines in the script.


...-Property LastWriteTime,CreationTime,
    @{Name="Size";Expression={$_.Length}},
    @{Name="Lines";Expression={(Get-Content $_.FullName | Measure-Object -Line).Lines}},
    Directory,Name,FullName

Finally, the results are piped to Out-GridView.


... | Out-GridView -Title $Title

For now, I have to manually open the file. Perhaps I’ll create a WinForm or use ShowUI to integrate it into the PowerShell ISE.

You can download Get-LatestScript, which includes comment-based help.

Get File Utilization by Extension

In the past I’ve posted a few PowerShell functions that provide all types of file and folder information. The other day I had a reason to revisit one of them and I spent a little time revising and expanding it. This new function, Get-Extension, will search a given folder and create a custom object for each file extension showing the total number of files, the total size, the average size, and the maximum size. At its core, the function takes output from Get-ChildItem and pipes it to Measure-Object. But I’ve incorporated features such as filtering and the ability to run the entire function as a background job.

By default, the function searches the top level of your $ENV:Temp folder and returns a custom object for each file type.
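
In other words, no parameters are required, although you can point it at another folder (the path below is just a placeholder):

#report on file extensions in the top level of $env:TEMP
Get-Extension

#or specify a folder
Get-Extension -Path C:\Scripts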

Here’s how this works.

The function uses a few parameters from Get-ChildItem, like -Include, -Exclude and -Force. If you use one of the filtering parameters, then you also need to use -Recurse. You can specify it, or the function will automatically enable it if it detects -Include or -Exclude.
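
I’m not reproducing the original parameter logic here, but the auto-enable check might look something like this sketch:

#-Include and -Exclude require recursion, so turn it on if necessary
if (($Include -or $Exclude) -and (-not $Recurse)) {
    Write-Verbose "Enabling -Recurse because a filtering parameter was specified"
    $Recurse = $True
}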

Obviously (I hope), this only works on the file system. But I went ahead and added some code to verify that the specified path is from the FileSystem provider.

Normally, I’m not a big fan of Return, but in this situation it is exactly what I want since I need to terminate the pipeline. I could have also thrown an exception here but decided not to get that wild. Assuming the path is valid, the function builds a command string based on the specified parameters.
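
Here’s a sketch of that check plus the start of the command string; the variable names are my own:

#verify the specified path belongs to the FileSystem provider
if ((Resolve-Path -Path $Path).Provider.Name -ne "FileSystem") {
    Write-Warning "$Path is not a file system path."
    Return
}

#build the Get-ChildItem command as a string from the parameters that were used
$cmd = "Get-ChildItem -Path '$Path'"
if ($Include) { $cmd += " -Include $($Include -join ',')" }
if ($Exclude) { $cmd += " -Exclude $($Exclude -join ',')" }
if ($Recurse) { $cmd += " -Recurse" }
if ($Force)   { $cmd += " -Force" }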

The function will invoke this string using Invoke-Expression and filter out any folders since all I care about are files.
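
Something along these lines should do it, although the original may differ:

#run the constructed command and keep only the files
$files = Invoke-Expression $cmd | Where-Object { -not $_.PSIsContainer }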

The results are then grouped using Group-Object. Each extension group is piped to Measure-Object to calculate the statistics based on the file’s length property.
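
In sketch form, continuing from the $files variable above:

#group the files by extension and measure each group
$files | Group-Object -Property Extension | ForEach-Object {
    $measure = $_.Group | Measure-Object -Property Length -Sum -Average -Maximum
    #the measurements become a custom object in the next step
}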

Lastly, the function creates a custom object representing each file extension using the New-Object cmdlet.
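
Inside that loop, the object creation might look like this; the property names are my guesses, not necessarily the originals:

New-Object -TypeName PSObject -Property @{
    Extension = $_.Name
    Count     = $_.Count
    TotalSize = $measure.Sum
    AvgSize   = $measure.Average
    MaxSize   = $measure.Maximum
}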

Because I’m writing objects to the pipeline, you can further sort, filter, export, or whatever. This is what makes PowerShell so flexible and valuable to IT Pros.

One thing I quickly realized was that scanning a large folder, such as a Documents folder or a file share UNC, could take a long time. I could use Start-Job with my original function, but it was a bit awkward. So I decided to include -AsJob as a parameter and move the job command into the function itself. This works because I take the entire core command and wrap it in a scriptblock.

Because of scoping, the scriptblock needs parameters so I can pass it my command string and the Path variable, which are used within the scriptblock. After $sb has been defined, if -AsJob was specified, the function uses Start-Job to create a background job. Otherwise, it uses Invoke-Command to execute it interactively.
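
The skeleton is roughly this, where the body of $sb is the invoke/group/measure code sketched above:

$sb = {
    Param($cmd, $Path)
    #invoke $cmd, filter out folders, group by extension and write the custom objects
}

if ($AsJob) {
    Write-Verbose "Running as a background job"
    Start-Job -ScriptBlock $sb -ArgumentList $cmd, $Path
}
else {
    Invoke-Command -ScriptBlock $sb -ArgumentList $cmd, $Path
}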

Use the normal job cmdlets to get the results and manage the job. But now I can run something like this:
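
For example, something like this (the share and filter are placeholders):

#scan a file share in the background
Get-Extension -Path \\file01\groups -Include *.log,*.bak -AsJob

#when the job finishes, collect the results
Get-Job | Wait-Job | Receive-Job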

As always, I hope you’ll let me know how this works for you. The complete script has comment-based help and an optional line to uncomment at the end to create an alias for the function.

Download Get-Extension.

Get Process Detail

The other day I posted a snippet of code that I was using to monitor process memory utilization for a few web browsers. I thought the information and technique were useful enough to modularize in the form of a function. Perhaps I want to check the working set of a different process or even on a different computer. This is what I came up with.
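
The original snippet isn’t included in this excerpt, but a minimal sketch of the idea might look like this; the default process name and the calculated properties are my assumptions:

Function Get-ProcessDetail {
    [cmdletbinding()]
    Param(
        [Parameter(Position=0)]
        [string]$Name = "iexplore",
        [string]$Computername = $env:COMPUTERNAME
    )
    #Get-Process supports -ComputerName in PowerShell 2.0
    Get-Process -Name $Name -ComputerName $Computername |
        Select-Object -Property Name, Id, WorkingSet,
            @{Name="WS(MB)";Expression={[math]::Round($_.WorkingSet/1MB,2)}},
            @{Name="Computername";Expression={$Computername}}
}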

Remote PowerShell Performance Comparison

Fellow Windows PowerShell MVP Marco Shaw clued me in on a Microsoft blog post that did a terrific job of comparing, from a performance perspective, the different PowerShell 2.0 techniques you can use when managing remote computers. The results are pretty much as I would expect.
