Friday Fun: A PowerShell Tickler

I spend a lot of my day in the PowerShell console. As you might imagine, I often have a lot going on and sometimes it is a challenge to keep on top of everything. So I thought I could use PowerShell to help out. I created a PowerShell tickler system. Way back in the day, a tickler system was something that gave you a reminder about an impending event or activity. What I decided I wanted was a tickler that would display whenever I started PowerShell.

This doesn’t replace my calendar and task list, but it lets me see events I don’t want to miss right from PowerShell. As I worked through this idea I ended up with a PowerShell module, MyTickle.psm1, that has a number of functions for managing the tickle events, as I call them. From PowerShell I can get, add, set and remove events. I thought the module would make a good Friday Fun post because it certainly isn’t a high impact project, but it offers some ideas on building a module and functions that I hope you’ll find useful.

For right now the module is a single file. Here’s the file, and below I’ll talk about how it works.

You should be able to copy the code from the WordPress plugin and paste it into a script file locally. You can call it whatever you want, just remember to use a .psm1 file extension. The module uses some PowerShell 3.0 features like ordered hashtables, but you could revise it to run in PowerShell 2.0. Fundamentally the approach should work in both versions.
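
To give you an idea of what that revision involves, here is a quick sketch (not code from the module) showing the v3 ordered hashtable syntax next to a v2-friendly alternative. The property names are made up for illustration.

#PowerShell 3.0: [ordered] keeps hashtable keys in the order you define them
$hash = [ordered]@{
   ID      = 1
   Event   = "Sample event"
   Date    = "2/1/2013 9:00AM"
   Comment = "just an example"
}

#PowerShell 2.0 alternative: an OrderedDictionary does the same job
$hash2 = New-Object System.Collections.Specialized.OrderedDictionary
$hash2.Add("ID",1)
$hash2.Add("Event","Sample event")
$hash2.Add("Date","2/1/2013 9:00AM")
$hash2.Add("Comment","just an example")

#either one can be turned into a custom object with a predictable property order
New-Object -TypeName PSObject -Property $hash2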

The events are stored in a CSV file that I reference with a module variable, $TicklePath. The default is a file called mytickler.csv, which will be in your WindowsPowerShell folder under Documents. The module also defines a variable called $TickleDefaultDays, with a default value of 7. This limits displayed events to those that fall within that range. To use the module, I added a couple of lines to my PowerShell profile.
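
The exact lines aren’t shown here, but they amount to something like this (the module path is an assumption; point it at wherever you saved MyTickle.psm1):

#in $profile: load the tickler module and show upcoming events at startup
Import-Module "$env:USERPROFILE\Documents\WindowsPowerShell\MyTickle.psm1"
Show-TickleEvent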

The result is that when I launch a new PowerShell session I see something like this (the message about help updates is from something else, so disregard it):
(screenshot of the Show-TickleEvent output)

Here’s how it works.

The Show-TickleEvent function imports events from the CSV file that will happen within the next 7 days. Each object also gets an additional property that is a timespan object for how much time remains. The function then parses event information and constructs a “box” around the events.

I set a foreground color depending on how imminent the event is and then write each event to the console, wrapped in my border.
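
The module above has the real thing, but a stripped-down sketch of the idea looks something like this. Treat the Event column name and the color thresholds as my illustration, not necessarily what the module actually uses.

Function Show-TickleEvent {
  [cmdletbinding()]
  Param([int]$Days = $TickleDefaultDays)

  #import events that fall between now and the cutoff and add a timespan property
  $cutoff = (Get-Date).AddDays($Days)
  $upcoming = Import-Csv -Path $TicklePath |
    Where-Object { ([datetime]$_.Date) -ge (Get-Date) -and ([datetime]$_.Date) -le $cutoff } |
    Select-Object *,@{Name="Countdown";Expression={ ([datetime]$_.Date) - (Get-Date) }}

  #wrap the events in a simple border
  $border = "*" * 60
  Write-Host $border -ForegroundColor Cyan

  foreach ($item in $upcoming) {
    #pick a color based on how imminent the event is
    if ($item.Countdown.TotalHours -le 24)     { $color = "Red" }
    elseif ($item.Countdown.TotalHours -le 48) { $color = "Yellow" }
    else                                       { $color = "Green" }

    $msg = "* {0} in {1} day(s) {2} hour(s)" -f $item.Event,$item.Countdown.Days,$item.Countdown.Hours
    Write-Host $msg -ForegroundColor $color
  }

  Write-Host $border -ForegroundColor Cyan
}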

I purposely used Write-Host so that I could color code events and because I didn’t want the profile to write anything to the pipeline. Because the module is loaded at the start of my PowerShell session, I can always run Show-TickleEvent and even specify a different number of days. If I want objects, then I can use the Get-TickleEvent function, which will import the csv events based on criteria like ID or name. The function uses parameter sets and I create a filter scriptblock depending on the parameter set.

When I import the CSV file, I add types to the properties because otherwise everything would be a string, and then pipe each object to my filter.
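
Again, purely as a sketch rather than the module’s exact code, the pattern is roughly this (the ID, Event, Date and Comment column names are assumptions):

Function Get-TickleEvent {
  [cmdletbinding(DefaultParameterSetName="Days")]
  Param(
    [Parameter(ParameterSetName="ID")]
    [int]$Id,
    [Parameter(ParameterSetName="Name")]
    [string]$Name,
    [Parameter(ParameterSetName="Days")]
    [int]$Days = $TickleDefaultDays
  )

  #build a filter scriptblock based on which parameter set was used
  Switch ($PSCmdlet.ParameterSetName) {
    "ID"   { $filter = { $_.ID -eq $Id } }
    "Name" { $filter = { $_.Event -like $Name } }
    "Days" { $filter = { $_.Date -ge (Get-Date) -and $_.Date -le (Get-Date).AddDays($Days) } }
  }

  #re-type the imported properties, then pipe each object through the filter
  Import-Csv -Path $TicklePath |
    Select-Object @{Name="ID";Expression={[int]$_.ID}},
                  @{Name="Event";Expression={$_.Event}},
                  @{Name="Date";Expression={[datetime]$_.Date}},
                  Comment |
    Where-Object -FilterScript $filter
}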

These objects come in handy because they can be piped to Set-TickleEvent to modify values like event name, date or comment. Or I can pipe to Remove-TickleEvent to delete entries. The deletion process in essence finds all lines in the CSV file that don’t start with the correct id and creates a new file using the same name.
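
In sketch form, and assuming the id really is the first field on each line, that approach looks something like this:

Function Remove-TickleEvent {
  [cmdletbinding(SupportsShouldProcess=$True)]
  Param(
    [Parameter(Mandatory=$True,ValueFromPipelineByPropertyName=$True)]
    [int]$ID
  )
  Process {
    if ($PSCmdlet.ShouldProcess("tickle event ID $ID")) {
      #keep every line that doesn't start with this id (the header survives
      #because it doesn't start with a number) and rewrite the same file
      $pattern = '^"?{0}"?,' -f $ID
      (Get-Content -Path $TicklePath) |
        Where-Object { $_ -notmatch $pattern } |
        Set-Content -Path $TicklePath
    }
  }
}

Because $ID binds from the pipeline by property name, output from Get-TickleEvent can be piped straight in.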

Finally, after accidentally wiping out event files, I added a simple backup function that copies the CSV file to the same directory but with a .BAK file extension. You could specify an alternate path, but it defaults to the WindowsPowerShell folder.
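
A minimal version of that idea might look like this; the Backup-TickleFile name and its parameter are my stand-ins, not necessarily what the module calls them:

Function Backup-TickleFile {
  [cmdletbinding()]
  Param(
    #destination folder; defaults to the folder that holds the tickle file
    [string]$Destination = (Split-Path -Path $TicklePath)
  )
  #swap the .csv extension for .bak and copy the file
  $bakName = (Split-Path -Path $TicklePath -Leaf) -replace '\.csv$','.bak'
  Copy-Item -Path $TicklePath -Destination (Join-Path -Path $Destination -ChildPath $bakName)
}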

Hopefully I won’t miss important events again, assuming of course that I add them to my tickler file. I’ll let you play with Add-TickleEvent to see how that works, or you could always modify the CSV file in Notepad.

If you actually use this, I hope you’ll let me know.

Using Types with Imported CSV Data in PowerShell

The Import-CSV cmdlet in PowerShell is incredibly useful. You can take any CSV file and pump objects to the pipeline. The cmdlet uses the CSV header as properties for the custom object.


PS S:\> import-csv .\testdata.csv

Date : 1/18/2012 6:45:30 AM
Name : Data_1
Service : ALG
Key : 1
Size : 25

Date : 1/18/2012 2:17:30 AM
Name : Data_2
Service : AppIDSvc
Key : 2
Size : -30
...

But there is a downside: all of the properties are strings.


PS S:\> import-csv .\testdata.csv | get-member

TypeName: System.Management.Automation.PSCustomObject

Name MemberType Definition
---- ---------- ----------
Equals Method bool Equals(System.Object obj)
GetHashCode Method int GetHashCode()
GetType Method type GetType()
ToString Method string ToString()
Date NoteProperty System.String Date=1/18/2012 6:45:30 AM
Key NoteProperty System.String Key=1
Name NoteProperty System.String Name=Data_1
Service NoteProperty System.String Service=ALG
Size NoteProperty System.String Size=25
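
For instance, a plain sort on the Date property at this point compares text, not dates, so with this test data the 1/11 and 1/18 entries would come back ahead of 1/9:

PS S:\> import-csv .\testdata.csv | sort Date | Select Date,Name,Size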

This means some tasks, such as sorting or filtering, won’t work the way you expect. But there are ways to get around this limitation. One way is to use an expression to cast a property to a different type. For example, I want to sort my test data on the Date property, but it needs to be a [DateTime] object to sort properly. Here’s how:


PS S:\> import-csv testdata.csv | sort @{expression={$_.date -as [datetime]}} | Select Date,Name,Size

Date Name Size
---- ---- ----
1/9/2012 6:28:30 PM Data_25 26
1/11/2012 11:13:30 AM Data_20 44
1/11/2012 6:28:30 PM Data_23 33
1/13/2012 12:13:30 AM Data_16 42
1/13/2012 4:45:30 PM Data_24 47
...

My output object properties are all still strings. All I did was cast the Date property in the Sort expression. Here’s an example using filtering.


PS S:\> import-csv testdata.csv | where {($_.date -as [datetime]) -le ("1/12/2012" -as [datetime])} | Select Date,Name,Size

Date Name Size
---- ---- ----
1/11/2012 11:13:30 AM Data_20 44
1/11/2012 6:28:30 PM Data_23 33
1/9/2012 6:28:30 PM Data_25 26

These examples only produce results on the fly; the underlying objects still have string properties. More likely I want to import the CSV file as typed objects. Assuming you know in advance the property names and what types you want to use, here’s how you could achieve this.


PS S:\> $data=import-csv testdata.csv | Select @{Name="Date";Expression={[datetime]$_.Date}}, Name,Service,@{Name="Key";Expression={[int32]$_.Key}},@{Name="Size";Expression={[int32]$_.Size}}

I imported my CSV file and piped it to Select-Object, using hash tables to redefine the properties with appropriate types. Import-CSV writes a PSCustomObject to the pipeline anyway so using Select-Object has no effect other than giving me typed properties.


PS S:\> $data | get-member

TypeName: Selected.System.Management.Automation.PSCustomObject

Name MemberType Definition
---- ---------- ----------
Equals Method bool Equals(System.Object obj)
GetHashCode Method int GetHashCode()
GetType Method type GetType()
ToString Method string ToString()
Date NoteProperty System.DateTime Date=1/18/2012 6:45:30 AM
Key NoteProperty System.Int32 Key=1
Name NoteProperty System.String Name=Data_1
Service NoteProperty System.String Service=ALG
Size NoteProperty System.Int32 Size=25

Now I can use the $data objects any way I want.


PS S:\> $data | where {$_.size -ge 40 -AND $_.key -le 10}

Date : 1/17/2012 11:57:30 PM
Name : Data_3
Service : Appinfo
Key : 3
Size : 42

I’m working on something that takes this idea to the next level, but it isn’t quite ready for prime time. In the meantime, I hope this will help you manage imported objects a bit more efficiently and let you really take advantage of the PowerShell pipeline.