Friday Fun: The Measure of a Folder

Last week, I demonstrated how to measure a file with PowerShell. This week let’s go a step further and measure a folder. I’m going to continue to use Measure-Object, although this time I will need to use it to measure numeric property values.

Here’s the complete function after which I’ll point out a few key points.

The command will use the current path, or you can pipe in a directory name or the output of a Get-ChildItem expression, as I show in the help examples. One thing I added is a test to make sure the path is a file system path, because anything else wouldn’t really work.

I resolve the path so that I can get the actual name of the current location (.) and then test the provider. The other interesting feature of this function is that I format the results on the fly.

The function has a Unit parameter which has a default value of bytes. But you can also specify one of the PowerShell numeric shortcuts like KB or GB. In the function I use a Switch construct to create a custom property on the fly.

Because there can only be a single value for $Unit, I’m including the Break directive so that PowerShell won’t try to process any other potential matches in the Switch construct. Realistically, there’s a negligible performance gain in this situation but I wanted you to see how this works.
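Since the original listing may differ, here is a minimal sketch of the approach described above. The function name, parameter layout, and output property names are my own reconstruction, not necessarily the original code:

```powershell
Function Measure-Folder {
    [cmdletbinding()]
    Param(
        [Parameter(Position=0,ValueFromPipeline=$True)]
        [string]$Path = ".",
        [ValidateSet("Bytes","KB","MB","GB")]
        [string]$Unit = "Bytes"
    )
    Process {
        #resolve the path so that "." becomes the actual location name
        $Resolved = Resolve-Path -Path $Path
        #test the provider: only file system paths make sense here
        if ($Resolved.Provider.Name -ne "FileSystem") {
            Write-Warning "$Path is not a file system path."
            return
        }
        $stats = Get-ChildItem -Path $Resolved.Path -File |
            Measure-Object -Property Length -Sum
        #use a Switch construct to build the size value on the fly,
        #breaking out as soon as a unit matches
        Switch ($Unit) {
            "KB"    { $size = $stats.Sum/1KB ; Break }
            "MB"    { $size = $stats.Sum/1MB ; Break }
            "GB"    { $size = $stats.Sum/1GB ; Break }
            Default { $size = $stats.Sum }
        }
        #write a custom object to the pipeline
        [pscustomobject]@{
            Path  = $Resolved.Path
            Files = $stats.Count
            Size  = $size
            Unit  = $Unit
        }
    }
}
```

You could run something like `Measure-Folder C:\Scripts -Unit MB` or pipe a path into it.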


I’ve taken some basic commands you could use interactively such as Get-ChildItem and Measure-Object and built a tool around them using a hashtable of properties to create a custom object. I hope you pick up a tip or two from this. If you have any questions about what I’m doing or why, please don’t hesitate to ask because someone else might have the same question.

Enjoy your weekend!

PowerShell Clean Up Tools

A few years ago I think I posted some PowerShell clean up tools. These were functions designed to help clear out old files, especially for folders like TEMP. Recently I decided to upgrade them to at least PowerShell 3.0 to take advantage of v3 cmdlets and features. I use these periodically to clean out my temp folders. I should probably set them up as a scheduled job to run monthly, but I haven’t gotten around to that yet. In the meantime, let me show you what I have.

First, I have a function to delete files from a directory that were last modified before a given date.

The function supports -WhatIf so that if I run it, Remove-Item will only show me what it would delete. I use a hashtable to build a set of parameters to splat to Get-ChildItem. I love this technique for building dynamic commands. I use this command on my temp folders to delete files older than the last time the computer booted. My assumption is that anything in TEMP older than the last boot time is fair game for deletion.
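A sketch of what such a function might look like; the function name, parameter names, and the CIM-based default cutoff are my own assumptions based on the description:

```powershell
Function Remove-OldFile {
    [cmdletbinding(SupportsShouldProcess)]
    Param(
        [string]$Path = $env:TEMP,
        #assumed default cutoff: the last boot time, via a v3 CIM cmdlet
        [datetime]$Cutoff = (Get-CimInstance -ClassName Win32_OperatingSystem).LastBootUpTime
    )
    #build a hashtable of parameters to splat to Get-ChildItem
    $gciParams = @{
        Path    = $Path
        File    = $True
        Recurse = $True
    }
    Get-ChildItem @gciParams |
        Where-Object { $_.LastWriteTime -lt $Cutoff } |
        Remove-Item
}
```

Because the function declares SupportsShouldProcess, running `Remove-OldFile -WhatIf` makes Remove-Item only show what it would delete.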

If you look at this function, you’ll notice that it only affects files. It leaves folders alone. I suppose I could modify it to remove old folders as well, but there’s a chance a folder might have a newer file somewhere in its hierarchy that shouldn’t be deleted, so I decided to simply focus on old files. To clean up folders, I use this:

This command looks for empty folders, or more precisely folders with no files. The function gets all of the top-level folders in the specified path and then searches each folder recursively for any files. If no files are found, the folder is removed.
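Something along these lines would implement the logic just described; the function name is my own placeholder:

```powershell
Function Remove-EmptyFolder {
    [cmdletbinding(SupportsShouldProcess)]
    Param([string]$Path = ".")

    #get the top-level folders, then keep only those with no files
    #anywhere in their hierarchy, and remove them
    Get-ChildItem -Path $Path -Directory |
        Where-Object {
            (Get-ChildItem -Path $_.FullName -File -Recurse | Measure-Object).Count -eq 0
        } |
        Remove-Item -Recurse
}
```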

The two tools are great on their own. To put them to use, I created a small script.

I think of the script as a “canned” PowerShell session. Instead of typing the commands to clean up a few folders, I simply run the script. I inserted some Write-Host commands so I could see what the script was doing. I clean out all the old files first and then make a second pass to delete any empty folders. I trust it goes without saying that if you want to use these, you have to test in a non-production environment.
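The script might look something like this, assuming clean-up functions named Remove-OldFile and Remove-EmptyFolder like those described above have been loaded; the names and paths here are illustrative:

```powershell
#CleanTemp.ps1 - a "canned" clean-up session

Write-Host "Removing old files from temp folders" -ForegroundColor Green
Remove-OldFile -Path $env:TEMP
Remove-OldFile -Path C:\Windows\Temp

Write-Host "Making a second pass to remove empty folders" -ForegroundColor Green
Remove-EmptyFolder -Path $env:TEMP
Remove-EmptyFolder -Path C:\Windows\Temp

Write-Host "Finished" -ForegroundColor Green
```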


Getting Top Level Folder Report in PowerShell

One of the sessions I presented recently at TechDays San Francisco was on file share management with PowerShell. One of the scripts I demonstrated was for a function to get information for top level folders. This is the type of thing that could be handy to run, say, against the root of your shared users folder, or the root of a group share where each subfolder is a share that belongs to a different group. My function takes advantage of a new feature for Get-ChildItem that makes it much easier to retrieve only files or directories. Here’s my Get-FolderSize function.

The function defaults to the local path and gets a collection of all of the top level folders, that is, those found directly in the root. The function then takes the collection of folders and pipes them to ForEach-Object. Most of the time we only use the Process scriptblock with ForEach-Object, but I want to take advantage of the Begin and End blocks as well. In the Begin scriptblock I measure all of the files in the root of the parent path and create a custom object that shows the number of files and total size in bytes. I’m going to get this same information for each child folder as well.

The process scriptblock does just that for each top level folder. This version of my function uses Write-Progress to display progress, and in the End scriptblock I have code to complete the progress bar, although it works just fine without it.

Other techniques I’d like to point out are the use of splatting and error handling. You’ll notice that I’m using the common -ErrorVariable parameter. After exploring the different types of exceptions I decided I could easily display any errors and the paths in the Catch block. I’m using Write-Warning, but this could just as easily be written to a text file.
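The structure described in the last few paragraphs can be sketched like this; the property and activity names are my reconstruction, not the original listing:

```powershell
Function Get-FolderSize {
    [cmdletbinding()]
    Param([string]$Path = ".")

    Get-ChildItem -Path $Path -Directory |
    ForEach-Object -Begin {
        #measure the loose files in the root of the parent path first
        $stats = Get-ChildItem -Path $Path -File |
            Measure-Object -Property Length -Sum
        [pscustomobject]@{
            Path       = (Resolve-Path -Path $Path).Path
            TotalFiles = $stats.Count
            TotalSize  = $stats.Sum
        }
    } -Process {
        $folder = $_
        Write-Progress -Activity "Measuring folders" -Status $folder.FullName
        Try {
            $stats = Get-ChildItem -Path $folder.FullName -File -Recurse `
                -ErrorAction Stop -ErrorVariable ev |
                Measure-Object -Property Length -Sum
            [pscustomobject]@{
                Path       = $folder.FullName
                TotalFiles = $stats.Count
                TotalSize  = $stats.Sum
            }
        }
        Catch {
            #display any errors and the offending path
            Write-Warning "Error measuring $($folder.FullName): $($ev[0].Exception.Message)"
        }
    } -End {
        #complete the progress bar
        Write-Progress -Activity "Measuring folders" -Completed
    }
}
```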

The function writes an object like this for every folder.

Here’s an example of complete output:

Because I’ve written objects to the pipeline, I could pipe this to Out-GridView, export to a CSV file or create an HTML report.

This is just a taste of what you can accomplish with some basic PowerShell commands.


File Age Groupings with PowerShell

I’m always talking about how much the object-nature of PowerShell makes all the difference in the world. Today, I have another example. Let’s say you want to analyze a directory, perhaps a shared group folder for a department. And you want to identify files that haven’t been modified in a while. I like this topic because it is real world and offers a good framework for demonstrating PowerShell techniques.

You would like to divide the files into aging “buckets”. Let’s begin by getting all of the files. I’m using PowerShell 3.0 so you’ll have to adjust parameters if you are using 2.0. You can run all of this interactively in the console, but I think you’ll find using a script much easier.
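The starting point is a simple collection of file objects; the path here is just an example:

```powershell
#-File is new in PowerShell 3.0 and limits the results to files only
$path = "C:\GroupShare"   #an example path
$files = Get-ChildItem -Path $path -File -Recurse
```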

Now, let’s add a new property, or member, to the file object called FileAgeDays, which will be the number of days since the file was last modified, based on the LastWriteTime property. We’ll use the Add-Member cmdlet to define this property.
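A command along these lines (reconstructed from the description) would do it:

```powershell
#add a ScriptProperty so the value is calculated when read
$files | Add-Member -MemberType ScriptProperty -Name FileAgeDays -Value {
    #days since last modification, cast to an integer
    [int]((Get-Date) - $this.LastWriteTime).TotalDays
}
```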

The new property is technically a ScriptProperty, so we can run a scriptblock to define the value. In this case we’re subtracting the LastWriteTime value of each object from the current date and time. This returns a TimeSpan object, but all we need is the TotalDays property, which is cast as an integer, effectively rounding the value. In a pipelined expression like Select-Object you would use $_ to indicate the current object in the pipeline. Here, we can use $this.

Next, we’ll add another script property to define our “bucket” property.
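For example, assuming the FileAgeDays property from the previous step; the bucket boundaries and property name here are my own illustration:

```powershell
$files | Add-Member -MemberType ScriptProperty -Name FileAgeBucket -Value {
    #assign each file to an aging bucket based on FileAgeDays
    If ($this.FileAgeDays -le 30)      { "0-30" }
    ElseIf ($this.FileAgeDays -le 90)  { "31-90" }
    ElseIf ($this.FileAgeDays -le 180) { "91-180" }
    ElseIf ($this.FileAgeDays -le 365) { "181-365" }
    Else                               { "365+" }
}
```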

The script block can be as long as you need it to be. Here, we’re using an If/ElseIf construct based on the FileAgeDays property we just created. If we look at $files now, we won’t see these new properties.


But that is because the new properties aren’t part of the default display settings. So we need to specify them.
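Something like this will display them; the property names assume the FileAgeDays and bucket properties added earlier:

```powershell
$files | Select-Object -Property Name,LastWriteTime,FileAgeDays,FileAgeBucket
```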


Now, we can group the objects based on these new properties.
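For instance, assuming a bucket property named FileAgeBucket:

```powershell
$files | Group-Object -Property FileAgeBucket |
    Sort-Object -Property Count -Descending
```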

Or perhaps we’d like to drill down a bit more.

Now we’ve added a new member to the GroupInfo object that will show the total size of all files in each group in MB. Don’t forget to use -PassThru to force PowerShell to write the new object back to the pipeline so it can be saved in the grouped variable. Finally, the result:
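The grouping and the added member might look like this; the variable and property names are my reconstruction:

```powershell
$grouped = $files | Group-Object -Property FileAgeBucket |
    Add-Member -MemberType ScriptProperty -Name TotalSizeMB -Value {
        #sum the Length of every file in the group and convert to MB
        [math]::Round((($this.Group | Measure-Object -Property Length -Sum).Sum)/1MB, 2)
    } -PassThru

$grouped | Select-Object -Property Name,Count,TotalSizeMB
```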


And there you go. Because we’re working with objects, adding new information is really quite easy. Certainly much easier than trying to do something like this in VBScript! And even if you don’t need this specific solution, I hope that you picked up a technique or two.

Friday Fun: Edit Recent File

As you might imagine, I work on a lot of PowerShell projects at the same time. Sometimes I’ll start something at the beginning of the week and then need to come back to it at the end of the week. The problem is that I can’t always remember what I called the file. So I end up listing files, sorting by last write time, and then looking at the last 10 or so. Time to be more efficient, so I came up with a little PowerShell v3 function called Edit-RecentFile.

#requires -version 3.0

Function Edit-RecentFile {

<#
.Synopsis
Open selected files for editing in the PowerShell ISE.
.Description
This command will list the most recently modified files in the specified directory. By default the command will list the 25 most recently modified files. Using Out-GridView as an interface, you can select the files you wish to edit and they will open in the PowerShell ISE. If you are already in the ISE then each file will open in a new tab. This command requires PowerShell v3.
.Example
PS C:\> edit-recentfile d:\myscripts -last 10
.Example
PS C:\> edit-recentfile c:\work\*.txt
#>

[cmdletbinding()]
Param(
[Parameter(Position=0)]
[ValidateScript({Test-Path $_})]
[string]$Path = "C:\Scripts",
[int]$Last = 25,
[switch]$Recurse
)

Write-Verbose "Getting $last recent files from $path."

#get last X number of files from the path and pipe to Out-GridView
$files = Get-ChildItem -Path $Path -File -Recurse:$Recurse |
Sort-Object -Property LastWriteTime -Descending |
Select-Object -First $Last -Property FullName,LastWriteTime,Length,Extension |
Out-GridView -Title "Select one or more files to edit and click OK" -PassThru

#if files were selected, open them in the ISE
if ($files) {
    Write-Verbose "Opening`n$($files.FullName | Out-String)"
    #if in the ISE use PSEDIT
    if ($host.Name -match "ISE") {
        Write-Verbose "Detected the PowerShell ISE"
        psedit $files.FullName
    }
    else {
        #otherwise assume we're in the console
        Write-Verbose "Defaulting to PowerShell console"
        ise ($files.FullName -join ",")
    }
} #if $files

} #close Edit-RecentFile

The core section gets the directory listing of the most recently modified files but then pipes them to Out-Gridview.


In PowerShell v3, Out-Gridview has been extended so that you can select one or more items and pass them back to the pipeline.

$files = Get-ChildItem -Path $Path -File -Recurse:$Recurse |
Sort-Object -Property LastWriteTime -Descending |
Select-Object -First $Last -Property FullName,LastWriteTime,Length,Extension |
Out-GridView -Title "Select one or more files to edit and click OK" -PassThru

Once I have the collection of files, I can use the FullName property and open them in the PowerShell ISE. From the console I can use the ise alias and pass a comma separated list of files. The fun part is turning the array of file paths into a comma separated string. But that is easily done using the -Join operator.

ise ($files.FullName -join ",")

If I am in the PowerShell ISE already, I can do a similar thing using the PSEDIT command, although this takes the array without any reformatting.

if ($host.Name -match "ISE") {
Write-Verbose "Detected the PowerShell ISE"
psedit $files.FullName

The function has a default path set to C:\Scripts, which you might want to change. By default the function will only look in the root of the folder, but you can use -Recurse. The download file will also create an alias, erf, for the function.

Download Edit-RecentFile and let me know how it works out for you.