I use hash tables quite a bit, and with the impending arrival of PowerShell 3.0 I expect to use them even more. PowerShell v3 allows you to define a hash table of default parameter values. I'm not going to cover that feature specifically, but it made me realize I needed a better way to export a hash table, say to a CSV file. So I put together a few functions to do just that.
To walk you through them, here's a simple hash table.
$hash=@{Name="jeff";pi=3.14;date=Get-Date;size=3 }
$hash
Name                           Value
----                           -----
Name                           jeff
pi                             3.14
date                           2/2/2012 10:04:54 AM
size                           3
I want to export this to a CSV file, but because PowerShell is all about the objects, I want to be sure to get the type information as well. Otherwise, when I import it later, everything will be a string. Here's what I can expect to export:
$hash.GetEnumerator() | Select Key,Value,@{Name="Type";Expression={$_.value.gettype().name}}
Key                            Value                          Type
---                            -----                          ----
Name                           jeff                           String
pi                             3.14                           Double
date                           2/2/2012 10:05:57 AM           DateTime
size                           3                              Int32
That looks good. I can take this command and run it through Export-CSV, which gives me this file:
#TYPE Selected.System.Collections.DictionaryEntry
"Key","Value","Type"
"Name","jeff","String"
"pi","3.14","Double"
"date","2/2/2012 10:05:57 AM","DateTime"
"size","3","Int32"
Perfect. Later, I will need to import this file and recreate my hash table. I can use Import-CSV as a starting point.
PS C:\> import-csv hash.csv
Key                            Value                          Type
---                            -----                          ----
Name                           jeff                           String
pi                             3.14                           Double
date                           2/2/2012 10:05:57 AM           DateTime
size                           3                              Int32
Good so far. All I need to do is create a hash table and add each entry to it. I could do something like this:
Import-csv hash.csv | foreach -begin {$hash=@{}} -process {$hash.Add($_.Key,$_.Value)} -end {$hash}
But if I do this, everything will be a string. Since I have Type information, let's use it.
Import-Csv -Path $path | ForEach-Object -begin {
    #define an empty hash table
    $hash=@{}
} -process {
    <#
      if there is a type column, then add the entry as that type
      otherwise we'll treat it as a string
    #>
    if ($_.Type) {
        $type=[type]"$($_.type)"
    }
    else {
        $type=[type]"string"
    }
    Write-Verbose "Adding $($_.key)"
    Write-Verbose "Setting type to $type"
    $hash.Add($_.Key,($($_.Value) -as $type))
} -end {
    #write hash to the pipeline
    Write-Output $hash
}
Here I'm taking the Type value from the import and turning it into a System.Type object, which I can then use to cast each value to the correct type. I'm checking for the Type property because I might have a CSV file without it. But as long as I have column headings for Key and Value this will work.
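If the casting bit seems a little mysterious, you can try it in isolation at the prompt:
PS C:\> $type = [type]"Double"
PS C:\> ("3.14" -as $type).GetType().Name
Double
The -as operator returns the value converted to the given type, or $null if the conversion fails.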
I turned all of this into a pair of advanced functions, Export-HashtoCSV and Import-CSVtoHash.
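Stripped down to its essentials, the export function is really just the earlier one-liner wrapped in an advanced function. The parameter layout here is my simplified sketch, not necessarily what's in the download, which also includes comment-based help:
Function Export-HashtoCSV {
    [cmdletbinding()]
    Param(
        [Parameter(Position=0,Mandatory=$True)]
        [ValidateNotNullOrEmpty()]
        [string]$Path,
        [Parameter(Position=1,Mandatory=$True,ValueFromPipeline=$True)]
        [ValidateNotNullOrEmpty()]
        [hashtable]$Hashtable
    )
    Process {
        Write-Verbose "Exporting hash table to $Path"
        #reuse the Key/Value/Type expression from above
        $Hashtable.GetEnumerator() |
            Select-Object Key,Value,@{Name="Type";Expression={$_.Value.GetType().Name}} |
            Export-Csv -Path $Path
    }
}
The import function is essentially the ForEach-Object pipeline from above with the same kind of wrapper, plus the extra verbose messages you see in the output below.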
PS C:\> $hash | Export-HashtoCSV myhash.csv
PS C:\> $newhash=Import-CSVtoHash .\myhash.csv -verbose
VERBOSE: Importing data from .\myhash.csv
VERBOSE: Adding Name
VERBOSE: Setting type to string
VERBOSE: Adding pi
VERBOSE: Setting type to double
VERBOSE: Adding date
VERBOSE: Setting type to System.DateTime
VERBOSE: Adding size
VERBOSE: Setting type to int
VERBOSE: Import complete
PS C:\> $newhash
Name                           Value
----                           -----
Name                           jeff
pi                             3.14
date                           2/2/2012 10:05:57 AM
size                           3
PS C:\> $newhash.date
Thursday, February 02, 2012 10:05:57 AM
PS C:\> $newhash.pi.gettype().name
Double
This certainly fulfills my needs. You can download a script file with both functions, including help. As always, enjoy and I hope you'll let me know how these work out for you.
These functions work fine as long as the hash table contains simple objects. This won’t work with nested hash tables or more complex values like a WMI object. For that I would need to use XML to properly handle the object. Guess I’ll have to take this another step. Still, I think most people use hash tables with pretty simple values so hopefully what I have here will meet most requirements.
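To see what I mean, here's what the same Select-Object expression produces for a nested value; the inner hash table collapses to its type name and the data is lost:
PS C:\> $nested = @{config=@{a=1;b=2}}
PS C:\> $nested.GetEnumerator() | Select Key,Value,@{Name="Type";Expression={$_.Value.GetType().Name}} | ConvertTo-Csv
#TYPE Selected.System.Collections.DictionaryEntry
"Key","Value","Type"
"config","System.Collections.Hashtable","Hashtable"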
Why not just use Export-Clixml and Import-Clixml? It will work even with nested hashes.
Because that is too easy and obvious. 🙂 I started all of this by thinking about CSV files and never got my head around anything else. I guess my original idea was to be able to take a CSV file that might have been created manually and turn it into a hash table. Clearly, if we want to serialize and deserialize right from PowerShell, this is a much better way to go. Thanks for keeping me on my toes.
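For anyone reading along, the suggestion is the CliXML cmdlets, which preserve type information (and nesting) automatically; the file name here is just for illustration:
PS C:\> $hash | Export-Clixml .\hash.xml
PS C:\> $newhash = Import-Clixml .\hash.xml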
JV is right about the clixml being much more convenient. It does have the downside of only being useful in PowerShell, and the .xml file could be orders of magnitude larger than the .csv equivalent.
I’ve done some of this sort of thing, and I’ll offer a possible alternative for the import:
if ($_.type -and ($_.value -as $_.type)) {
    $hash[$_.key] = $_.value -as $_.type
}
else {
    write-warning "Value $($_.value) failed cast as type $($_.type) for key $($_.key). Leaving value as string."
    $hash[$_.key] = $_.value
}
That has the advantage of also checking whether the value will cast to that type before it creates the key.
As I tried to explain in my previous comment, I started the whole process by thinking about an external file. I like your ideas too. So there are a couple of ways to handle this task depending on your situation. Thanks for taking the time to share.
Always good to have many different approaches.
Yes – size can be an issue with large hashes. Using a CSV would be fast and more compact. For an even more compact and useful approach, try using an ADO Recordset saved as a Microsoft binary format 'persisted recordset' file. This also allows for fast sorts and queries and is a good technique to have in your toolbox.
Jeff – I never thought of using a CSV to persist objects. It is a very useful method which I am sure I will use.
I guess I'm real old school; CSV was always the easiest format to work with. Simple to create and understand. But that simplicity also means some limitations. Thanks for your comments.
It’s not just the size of the hash, but the types of objects it contains.
Try this once:
[ipaddress]'192.168.1.1' | export-clixml temp.xml
get-content temp.xml
Not sure if I follow your thought. Is this supposed to be a problem? The Clixml format works the way I would expect.
It could have a dramatic effect on the size of the file if, for instance, you built and then exported a hash table of host names and IP addresses, and whatever you used to get that information was returning strongly typed IP addresses.
Without a doubt XML will be larger than a corresponding CSV, but you get something valuable for the “cost”. Personally, if my hash table had simple values, I’d stick to a CSV. What really drove all of this is the new $PSDefaultParameterValues variable in PowerShell 3.0. I was looking for a way to easily import a new set of default values from a CSV file. But of course, one thing led to another and here we are.
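In other words, the end game is something along these lines (the file name and key here are made up for illustration; $PSDefaultParameterValues keys take the form Cmdlet:Parameter):
PS C:\> #defaults.csv has Key,Value,Type columns with keys like "Get-Service:ComputerName"
PS C:\> $PSDefaultParameterValues = Import-CSVtoHash .\defaults.csv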
Personally, I think it depends on the circumstances. IMHO, there's also value in the fact that .csv data is a commonly understood and supported data format among a wide variety of applications, and you give that up with the clixml file.
There is no need to pick a single answer as best here. If you are on-site and do not have Jeff's code, or just need a quick snapshot of a hash that is not huge, then use clixml. It is fairly fast and very simple to use.
If you need a bigger hash, then take the extra time and dump to a CSV. Both are useful, and the drawbacks of each should be noted.