Who’s Driving this Shell?

Microsoft has been busy with the next iteration of PowerShell. As you should already know, this version will run cross-platform. The executable, or engine, is naturally different from what you are used to with Windows PowerShell. As I was trying out the latest PowerShell beta, I needed to identify the path to the current PowerShell engine. I then thought it might be helpful to get even more details, so I put together a quick PowerShell function called Get-PowerShellEngine.
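My function gathers more detail than this, but a minimal sketch of the idea looks something like the following, reading from $PSVersionTable and the current process (note that the PSEdition property only exists in newer versions, so it may be empty under older Windows PowerShell).

Function Get-PowerShellEngine {
    [CmdletBinding()]
    Param()

    [PSCustomObject]@{
        # path to the executable hosting this session
        Path    = (Get-Process -Id $PID).Path
        Version = $PSVersionTable.PSVersion
        Edition = $PSVersionTable.PSEdition
    }
}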

VMDK to VHDX PDQ

I have a very old VMware ESXi server that has outlived its useful life. The hardware is at least five years old and my VMware license has expired. I can still bring up the server and see the virtual machines, but that's about it. I keep the box around so I can run the PowerCLI cmdlets, at least in a limited fashion. However, there are a few virtual machines that I need to get at so I can move some applications and data to another Hyper-V virtual machine. Since I can't get the VMware virtual machine running, I can at least convert the disk to a VHDX and bring it up in a new Hyper-V virtual machine.

I already have PowerCLI 6.0 and the Hyper-V cmdlets installed on my computer. My initial thought was to use the free Microsoft Virtual Machine Converter, which you can download from Microsoft. But I ran into a number of issues using the GUI, primarily because not everything is in the same domain. That didn't matter much, though, because I didn't want to migrate the complete virtual machine; I just needed to bring up the old VM temporarily so I could migrate things off. Fortunately, a set of cmdlets ships with Microsoft's converter. Here's what I did.

First, I needed to connect to my ESXi box using the PowerCLI cmdlets.
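Assuming the snapin-based PowerCLI of that era and a placeholder host name, the connection looks like this:

# load PowerCLI and connect to the ESXi host
Add-PSSnapin VMware.VimAutomation.Core
Connect-VIServer -Server ESX -Credential (Get-Credential)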

I need to get the disk files for a given virtual machine. One benefit of PowerCLI is that you can easily browse the datastore files.
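For example, with a placeholder VM name and datastore path:

# list the VM's disk files
Get-HardDisk -VM "MyOldVM" | Select-Object Filename

# or browse the datastore through the vmstore: PSDrive
dir vmstore:\ha-datacenter\datastore1\MyOldVM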

I'll need all of these files. But first I need to be able to access them from the file system. Fortunately, PowerCLI has a handy cmdlet, Copy-DatastoreItem, for copying items from a datastore to the file system.
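With placeholder paths, pulling the VM's files down to a local folder looks like this:

# copy everything in the VM's datastore folder to the local file system
Copy-DatastoreItem -Item vmstore:\ha-datacenter\datastore1\MyOldVM\* -Destination C:\VMDK\MyOldVM\ -PassThru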

Once the files are copied I can begin the conversion. However, the Microsoft Virtual Machine Converter cmdlets aren't installed in an expected location, so I'll have to manually import them.
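On my machine the module lives under Program Files; adjust the path if your installation differs.

# the converter's module isn't on the PSModulePath, so import it by file path
Import-Module "C:\Program Files\Microsoft Virtual Machine Converter\MvmcCmdlet.psd1"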

Once imported, I can use ConvertTo-MvmcVirtualHardDisk.
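Using placeholder paths from my copy step, the conversion to a dynamically expanding VHDX looks something like this:

# convert the VMDK to a dynamic VHDX
ConvertTo-MvmcVirtualHardDisk -SourceLiteralPath C:\VMDK\MyOldVM\MyOldVM.vmdk -DestinationLiteralPath C:\VHD\MyOldVM.vhdx -VhdType DynamicHardDisk -VhdFormat Vhdx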

The conversion took about 20 minutes for a 40GB file. The converted file was 30GB for a dynamic hard disk. With the conversion complete, it is pretty easy to fire up a new Hyper-V virtual machine.
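The command was along these lines, with placeholder names; note the module-qualified cmdlet name.

# create a Hyper-V VM pointed at the converted disk
Hyper-V\New-VM -Name MyOldVM -MemoryStartupBytes 2GB -VHDPath C:\VHD\MyOldVM.vhdx -SwitchName LabNet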

You'll notice that I am using the module-qualified name for the New-VM cmdlet. That's because in my session I have both PowerCLI and Hyper-V cmdlets loaded, and they both have a New-VM cmdlet. Normally I wouldn't have both running in the same session, but since I do, I need to explicitly tell PowerShell which cmdlet to use.

And that's about it. Once the VM was running I uninstalled the VMware Tools, installed the Hyper-V Integration Services, and let Windows detect everything else. This will require re-activating Windows, but I'm hoping to migrate everything I need before that becomes an issue.

Copy and Mount a CD with PowerCLI

The other day I realized I needed to rebuild my SQL Server 2012 installation, which I'm running in a virtual machine on an ESX box. Given that I have PowerCLI and I like to do things from the command prompt when I can, I decided to mount the SQL Server 2012 ISO on the VM using PowerShell. This actually requires a few steps that I thought I would share.

First, you naturally need to have PowerCLI installed. Then import the necessary snapin and connect to the ESX host.


Add-PSSnapin VMware.VimAutomation.Core
Connect-VIServer -Server ESX

Once connected, I can begin the process. Next, I need to copy the ISO file to the datastore so that I can mount it from the VM. While the vmstore: PSDrive is fun to work with, you can't copy to it from the file system; copying between providers is simply not allowed. Instead I'll use the Copy-DatastoreItem cmdlet. This will allow me to copy a local file to the VMware datastore.

The tricky part here is to get the right format for the datastore destination. This is where I want to copy to:


PS vmstore:\ha-datacenter\datastore3\ISO> dir

Datastore path: [datastore3] ISO

LastWriteTime        Type             Length  Name
-------------        ----             ------  ----
8/18/2009 9:09 AM    IsoImageFile 2996799488  en_windows_server...

The “trick” is to grab the value for Datastore path from the directory listing. Thus, I can run this to copy the ISO file to the datastore.


$iso = "C:\users\jeff\Downloads\en_sql_server_2012_standard_edition_with_sp1_x86_x64_dvd_1228143.iso"
$dest = "vmstore:\ha-datacenter\datastore3\ISO"
Copy-DatastoreItem -Item $iso -Destination $dest -PassThru

You get a nice progress bar and in a few minutes the file is copied. Now I can mount it in the VM’s CDDrive using Set-CDDrive. The cmdlet will need a CDDrive object from a VM and the path to the ISO file. I’ll have to construct a path using the VMware datastore format. It looks a little funny because it is not a typical Windows path.


$isopath = "[datastore3] ISO\en_sql_server_2012_standard_edition_with_sp1_x86_x64_dvd_1228143.iso"

For situations like this, I find it easiest to use the corresponding Get cmdlet, and pipe the resulting object to the Set cmdlet.


Get-CDDrive -VM "globomantics db" | Set-CDDrive -IsoPath $isopath -Connected $true

The only other parameter I specified was -Connected, to connect the CD drive to the VM. This command takes a moment to run, and then in the VM I can "see" the DVD and use it normally. Awesome. When I'm finished I can dismount the CD much the same way.


Get-CDDrive -VM "globomantics db" | Set-CDDrive -NoMedia

This may seem like a lot of typing, but if it is something I need to do a lot I could build a simple script or function. And it is still faster (for me) than navigating the GUI.

Pipeline Power

Last week I came across a blog post that had a decent example using PowerShell and PowerCLI to get the disk location for all virtual machines. The posted code works and does display the information you might be after.


$myVMs = Get-VM

foreach ($vm in $myVMs) {
    $myDisks = @($vm | Get-HardDisk)
    foreach ($disk in $myDisks) {
        Write-Host $vm.Name, ($disk | Select-Object -ExpandProperty Filename)
    }
}

But I saw a teaching opportunity. Because the code works I can't say it is "wrong," but it really doesn't adopt the PowerShell paradigm. The first issue is that Write-Host only writes content to the console. There is no way with this command to do anything else with the results, such as sorting, grouping, or sending them to a text file.

The other issue is the reliance on ForEach loops. This is what we had to do in VBScript, but in PowerShell we can take advantage of the pipeline.


Get-VM | Select-Object Name,@{Name="Disk";Expression={$_ | Get-HardDisk | Select-Object -ExpandProperty Filename}}

But now I can do something with this, such as sorting by disk:


PS S:\> get-vm | Select Name,@{Name="Disk";Expression= {$_ | get-harddisk | Select -ExpandProperty Filename }} | sort Disk,Name

Name Disk
---- ----
Cluster Alpha {[datastore1] Cluster Alpha/Cluster ...
Globomantics Mail [datastore1] globomantics mail/Win2K...
MyCompany Exchange 2007 {[datastore1] MyCompany Exchange 200...
MyCompany XP {[datastore1] MyCompany XP/MyCompany...
MyCompany Windows 2008 [datastore1] MyCompany2008/Windows S...
MyCompanyDC 2K3R2 {[datastore1] MyCompanyDC 2K3R2/MyCo...
R2 Server Core -DEL [datastore1] Research Core DC/R2 Ser...
Cluster Bravo {[datastore2] Cluster Bravo/Cluster ...
MyCompany Vista {[datastore2] MyCompany Vista/Vista ...
...

Or if there are multiple disks, they are much easier to work with this way, something Write-Host can't give you.


PS S:\> $vminfo=get-vm | Select Name,@{Name="Disk";Expression= {$_ | get-harddisk | Select -ExpandProperty Filename }}
PS S:\> $vminfo[1].disk
[datastore2] MyCompany Vista/Vista Baseline.vmdk
[datastore2] MyCompany Vista/MyCompany Vista.vmdk
PS S:\> $vminfo | Export-Clixml c:\work\vminfo.xml

The tricky part here, I realize, is pulling a value from a nested object, in this case the Filename, and adding it to the VM object. I totally get that this is not something a beginner would necessarily discover on their own, which is why I write stuff like this. But the big difference is that I now have an object written to the pipeline that I can do something with, and I didn't have to keep track of what goes where in nested foreach loops.

The other advantage, although not universal, is performance. Running the ForEach code against my 23 VMs took almost 6 seconds. My PowerShell one-liner took a tad over 3 seconds.

I don't want you to think you can never use Write-Host or ForEach. Sometimes they make sense and may even perform better. But always ask yourself whether you are thinking the PowerShell way and pushing objects through the pipeline, or writing something that could be mistaken for VBScript.

By the way, I have posted most of this on the blog as a comment that is awaiting moderation. But I figured I would share it with my readers as well.

Create a Master PowerShell Online Help Page

As I hope you know, PowerShell cmdlets can include links to online help. This is very handy because it is much easier to keep online help up to date. To see online help for a cmdlet, use the -Online parameter.
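For example, this opens the help topic for Get-Service in your default browser:

Get-Help Get-Service -Online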

I decided to take things to another level and create an HTML page with links to online help.

I created a relatively simple script called New-OnlineHelpPage.ps1. The script uses Get-Command to retrieve a cmdlet name, its module or snapin, and its help link.

This expression filters out cmdlets without an online link. By default the script sorts by cmdlet name, but if you use the -SortModule parameter, it will sort by module or snapin; Get-Command treats them the same. The data is piped to ConvertTo-HTML to create the HTML report.
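A sketch of that filtering expression, with $data as a placeholder variable; HelpUri is the property Get-Command exposes for the online link.

# keep only cmdlets that define an online help link
$data = Get-Command -CommandType Cmdlet |
  Where-Object { $_.HelpUri } |
  Sort-Object Name |
  Select-Object Name, ModuleName, HelpUri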

The script lets you specify a CSS file path, which is stored in $CSS, and I include a sample one in the download below. But now for the fun part. The help URL looks like a link, but it isn't actionable. Remember that ConvertTo-HTML doesn't create a file; it creates HTML, which means I can intercept it and use a regular expression to find the URL and replace it with HTML code that turns it into a link. At the end of the process I finally pipe the HTML to Out-File to create the report.
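The exact regex in my script may differ, but the technique is along these lines ($Path stands in for the script's output file parameter):

# convert to HTML, then rewrite each bare URL as a clickable link
$data | ConvertTo-Html -Title "PowerShell Online Help" -CssUri $CSS |
  ForEach-Object {
      $_ -replace '(https?://\S+?)</td>', '<a href="$1" target="_blank">$1</a></td>'
  } | Out-File -FilePath $Path -Encoding ascii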

The last step is to launch the HTML page using Invoke-Item. This will launch the file with whatever application is associated with the file extension you specified.
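With the same placeholder $Path, that is simply:

# open the report with whatever application is associated with the extension
Invoke-Item -Path $Path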

The script takes a few parameters, including the output file path, the CSS file path, and the -SortModule switch mentioned above.

The file it creates is stored in your TEMP folder by default. The script will process any cmdlet loaded in your PowerShell session, so you can add snapins like PowerCLI or modules like ActiveDirectory and get those links as well. What you end up with is a single page with links to all the online help.

Even if this isn’t of use, I hope you picked up a little knowledge about Get-Command and ConvertTo-HTML. You can download a zip file with my script and a sample CSS file here.