Get Local Admin Group Members in a New Old Way

Yesterday I posted a quick article on getting the age of the local administrator account password. It seemed appropriate to follow up with a quick and dirty way to list all members of the local Administrators group. Normally I would turn to WMI (and I have written about this in the past). But WMI is relatively slow for this task, and even the new CIM cmdlets in PowerShell 3.0 don't improve performance. Instead I'm going to return to an old-school technique using the NET command.

It is very easy to see members. To query a remote computer all I need to do is wrap this in Invoke-Command and use PowerShell remoting.
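Locally, the command is as simple as this (the remote computer name below is a placeholder):

```powershell
# list members of the local Administrators group
net localgroup Administrators

# wrap the same command in Invoke-Command to query a remote computer
Invoke-Command -ScriptBlock { net localgroup Administrators } -ComputerName chi-dc01
```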

Yes, there is some overhead for remoting, but overall performance is pretty decent. And if you already have an established PSSession, even better. For a quick and dirty one-liner it doesn't get much better. Well, maybe it can.

I have no problem using legacy tools when they still get the job done, and this certainly qualifies. To make it more PowerShell friendly, though, let's clean up the output by filtering out blank lines, the status line at the end, and the "header" lines.
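One way to do that cleanup (a sketch; the skip count assumes the standard NET LOCALGROUP header layout):

```powershell
Invoke-Command -ScriptBlock {
  net localgroup Administrators |
    # drop blank lines and the trailing status message
    Where-Object {$_ -AND $_ -notmatch "command completed successfully"} |
    # skip the alias, comment, Members and dashed-line header rows
    Select-Object -Skip 4
} -ComputerName chi-dc01
```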

Boom. Now I only get the member names. Let’s go one more level and write an object to the pipeline and be better at handling output from multiple computers. I came up with a scriptblock like this:
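Something along these lines (my reconstruction of the idea; the variable name is mine):

```powershell
$sb = {
  Param([string]$Name = "Administrators")
  # get the member names, minus headers, blanks and the status message
  $members = net localgroup $Name |
    Where-Object {$_ -AND $_ -notmatch "command completed successfully"} |
    Select-Object -Skip 4
  # write a custom object to the pipeline
  New-Object -TypeName PSObject -Property @{
    Computername = $env:COMPUTERNAME
    Group        = $Name
    Members      = $members
  }
}
```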

This will create a simple object with properties for the computer name, group name, and members. Here's how I can use it with Invoke-Command.
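Assuming the scriptblock is stored in a variable such as $sb (the computer names here are placeholders):

```powershell
Invoke-Command -ScriptBlock $sb -ComputerName chi-dc01,chi-fp01 |
  Select-Object Computername,Group,Members
```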

Now I have objects that I can export to XML, convert to HTML, or send to a file. But since I've come this far, I might as well take a few more minutes and turn this into a reusable tool.

This function lets me specify a group of computers or PSSessions as well as the local group name. Today I may need to know who belongs to the local Administrators group, but tomorrow it might be Remote Desktop Users.
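A minimal sketch of such a function (my own reconstruction, not necessarily the downloadable version; it omits the PSSession parameter set for brevity):

```powershell
Function Get-NetLocalGroup {
  [cmdletbinding()]
  Param(
    [Parameter(Position=0,ValueFromPipeline=$True)]
    [string[]]$Computername = $env:COMPUTERNAME,
    [Parameter(Position=1)]
    [string]$Group = "Administrators"
  )
  Process {
    foreach ($computer in $Computername) {
      Invoke-Command -ComputerName $computer -ScriptBlock {
        Param($Name)
        # clean up the legacy output and emit an object
        $members = net localgroup $Name |
          Where-Object {$_ -AND $_ -notmatch "command completed successfully"} |
          Select-Object -Skip 4
        New-Object -TypeName PSObject -Property @{
          Computername = $env:COMPUTERNAME
          Group        = $Name
          Members      = $members
        }
      } -ArgumentList $Group
    }
  }
}
```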

Sometimes even old school tools can still be a part of your admin toolkit.

 

Find Files with PowerShell 3.0

My last few articles have looked at using WMI and CIM_DATAFILE class to find files, primarily using Get-WmiObject in PowerShell. But now that we have PowerShell 3.0 at our disposal, we can use the new CIM cmdlets. So I took my most recent version of Get-CIMFile and revised it specifically to use Get-CimInstance. I also took the liberty of adding a few more refinements, some of which you could integrate into previous versions. Here’s the new v3 function.

This version lets you search remote computers by name or CIM session. Because they are mutually exclusive options, the function uses parameter sets, defaulting to a computer name. I also added a validation check on the drive name using a regular expression. The function will fail if the value is not a letter followed by a colon.
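The relevant parameter pieces might look something like this (a sketch; the original function's exact parameter names may differ):

```powershell
Param(
  [Parameter(Position=0)]
  [string]$Name = "*.*",
  # the drive must be a letter followed by a colon, e.g. C:
  [ValidatePattern("^[A-Za-z]:$")]
  [string]$Drive = "C:",
  [Parameter(ParameterSetName="Computer")]
  [string[]]$Computername = $env:COMPUTERNAME,
  [Parameter(ParameterSetName="Session")]
  [Microsoft.Management.Infrastructure.CimSession[]]$CimSession
)
```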

Another major change was modifying the code to search for file names without an extension. What if you are looking for a file like README? The WQL query turned out to be more complicated than I imagined. It would be easier if the Extension property were NULL, but it isn't; it is a zero-length string. I found that in order to make this work, I needed to create a query like this:
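Something along these lines (my reconstruction; the file name and drive are examples):

```powershell
# match on an empty-string extension, not NULL
$query = "SELECT * FROM CIM_DataFile WHERE FileName='readme' AND Extension='' AND Drive='C:'"
Get-CimInstance -Query $query
```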

So I modified my code to adjust operators and variables that I use to build the filter string.

Now I can find files without an extension, or with.

The last major change you’ll notice is that I build a hash table of parameter values and then splat it.
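The pattern looks like this (a simplified sketch; in the function the filter string and the remoting parameter are built dynamically):

```powershell
# a sample WQL filter; the function assembles this string at run time
$filter = "Extension='ps1' AND Drive='C:'"

$paramHash = @{
  Classname = "CIM_DataFile"
  Filter    = $filter
}

# add the remoting parameter for whichever parameter set was used
# (shown here with a computer name; a CimSession would be added instead)
$paramHash.Add("Computername","chi-fp01")

# splat the hash table against Get-CimInstance
Get-CimInstance @paramHash
```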

This is a terrific technique when you are dynamically generating parameters.

Get-CimInstance can be used to query remote computers, assuming they are also running PowerShell 3.0. However, you can also use CIM Sessions which allow you to establish a connection to an older system using the DCOM protocol.
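Creating a DCOM-based CIM session looks like this (the computer name is a placeholder):

```powershell
# use DCOM for machines without PowerShell 3.0 / WSMan 3.0
$opt = New-CimSessionOption -Protocol Dcom
$session = New-CimSession -ComputerName chi-dc01 -SessionOption $opt

# the session can then be passed to Get-CimInstance or to the function
Get-CimInstance -ClassName Win32_OperatingSystem -CimSession $session
```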

The end result is that I can still use my PowerShell 3.0 function to query a PowerShell 2.0 machine as long as I have a pre-created session.


Now I have a very powerful tool that can search just about any computer in my domain.

Oh, one more thing I realized in working on this. Initially I was only paying attention to the file name and version. Then I noticed that the Size property in the default output was always empty. That struck me as odd and not very useful. So I looked at the actual object with Get-Member and it turns out there is a FileSize property which is populated. It looks like the default property set for CIM_DATAFILE uses the Size property, when it should really be FileSize. So keep that in mind as you are working with the results.

Download Get-CIMFile3 and try it out for yourself.

PowerShell Hyper-V Memory Report

Since moving to Windows 8, I’ve continued exploring all the possibilities around Hyper-V on the client, especially using PowerShell. Because I’m trying to run as many virtual machines on my laptop as I can, memory considerations are paramount as I only have 8GB to work with. Actually less since I still have to run Windows 8!

Anyway, I need to be able to see how much memory my virtual machines are using. The Get-VM cmdlet can show me some of the data.
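For example (run on the Hyper-V host):

```powershell
# memory-related properties from the VM object
Get-VM | Select-Object Name,State,MemoryAssigned,MemoryDemand,MemoryStatus
```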

Actually, there are more properties I can get as well.

Those values are in bytes, so I would need to reformat them into something more meaningful like MB or GB. Not especially difficult, but not something I want to have to type all the time. Now, I can also get memory information with Get-VMMemory, and this is formatted a little more nicely.

What I like about this cmdlet is that it also shows the buffer and priority settings.
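For example:

```powershell
# memory settings, including buffer and priority, for all virtual machines
Get-VM | Get-VMMemory |
  Select-Object VMName,DynamicMemoryEnabled,Minimum,Startup,Maximum,Buffer,Priority
```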

In the end, I decided the best course of action was to build my own function that combined information from both cmdlets. The result is a custom object that gives me a good picture of memory configuration and current use. The function, Get-VMMemoryReport, is part of a larger HyperV toolkit module I’m developing but I thought I’d share this with you now.

I wrote the function with the assumption of piping Hyper-V virtual machines to it. Although I can also pipe names to it and the function will then get the virtual machine.

Once the function has the virtual machine object, it also gets data from Get-VMMemory.

Finally, it creates a hash table using the new [ordered] attribute so that the key names will be displayed in the order I enter them. I use this hash table to write a custom object to the pipeline. I could have used the new [pscustomobject] attribute as well, but I felt that in a script, New-Object was a bit more meaningful. With this command, I get output like this:
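The pattern, as a simplified sketch of the function body (the VM name is a placeholder, and values are converted from bytes to MB):

```powershell
# a VM object from Get-VM and its memory settings from Get-VMMemory
$vm    = Get-VM -Name "Demo-VM"
$vmmem = Get-VMMemory -VMName $vm.Name

# [ordered] keeps the keys in the order they are entered
$hash = [ordered]@{
  Name     = $vm.Name
  Dynamic  = $vmmem.DynamicMemoryEnabled
  Assigned = $vm.MemoryAssigned/1MB
  Demand   = $vm.MemoryDemand/1MB
  Startup  = $vm.MemoryStartup/1MB
  Minimum  = $vmmem.Minimum/1MB
  Maximum  = $vmmem.Maximum/1MB
  Buffer   = $vmmem.Buffer
  Priority = $vmmem.Priority
}

# write the custom object to the pipeline
New-Object -TypeName PSObject -Property $hash
```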

Or I can explore the data in other ways. I can create an HTML report, export to a CSV or take advantage of Out-GridView.

Here’s the report for my currently running virtual machines.

The function defaults to connecting to the localhost, but I assume that if you have a Hyper-V server you could use this from any system that also has the Hyper-V module installed. I don't have a dedicated Hyper-V server to test with, so maybe someone will confirm this for me.

In the meantime, download Get-VMMemoryReport and let me know what you think.

Get My Variable Revisited

Last year I wrote a few articles on working with variables. One task I needed was to identify the variables that I had created in a given PowerShell session with a function I wrote called Get-MyVariable. I also posted an article on identifying the object type for a variable value. While trying to find something today I realized I could combine the two ideas.

One approach would be to leave my original function alone. When I need the variable value type, I could simply do this:


PS C:\> get-myvariable | Select Name,Value,@{Name="Type";Expression={$_.Value.GetType().Name}}

Name Value                       Type
---- -----                       ----
a    5/28/2012 8:45:25 AM        DateTime
bigp ps | where {$_.ws -gt 1...  ScriptBlock
cert [Subject]...                X509Certificate2
dirt Param([string]$Path=$en...  ScriptBlock
...

See the value in having a function that writes to the pipeline? However, I wanted this to be the default behavior so I decided to incorporate it into my original function. I also realized that there may be situations where I don’t want this information so I added a -NoTypeInformation switch parameter. The changes to my original function were minimal.


#filter out some automatic variables
$filtered=$variables | Where {$psvariables -notcontains $_.name -AND $_.name -notmatch $skip}

if ($NoTypeInformation) {
#write results without object types
$filtered
}
else {
#add type information for each variable
Write-Verbose "Adding value type"
$filtered | Select-Object Name,Value,@{Name="Type";Expression={$_.Value.GetType().Name}}
}

Now, I can run a command like this:


PS S:\> get-myvariable | where {$_.type -eq "Scriptblock"} | Select name

Name
----
bigp
dc01
dirt
disk
doy
run
up

I suppose I could further refine the function to do filtering in place for a specific type. But I’ll leave that exercise to you.

Download Get-MyVariable2 and try it out for yourself.

Friday Fun: Get Latest PowerShell Scripts

Probably like many of you, I keep almost all of my scripts in a single location. I'm also usually working on multiple items at the same time. Sometimes I have difficulty remembering the name of a script I might have been working on a few days ago that I need to return to. The concept is simple enough: search my script directory for PowerShell files sorted by the last write time and look for the file I need.

That’s a lot to type so why not build a function to do the work for me? In fact, since I spend a lot of time in the PowerShell ISE, why not produce graphical output? The easiest way is to pipe results to Out-Gridview. So after a little tinkering, I came up with Get-LatestScript.


Function Get-LatestScript {

[cmdletbinding()]

Param(
[Parameter(Position=0)]
[ValidateScript({Test-Path $_})]
[string]$Path=$global:ScriptPath,
[Parameter(Position=1)]
[ValidateScript({$_ -ge 1})]
[int]$Newest=10
)

if (-Not $path) {
$Path=(Get-Location).Path
}

#define a list of file extensions
$include="*.ps1","*.psd1","*.psm1","*.ps1xml"

Write-Verbose ("Getting {0} PowerShell files from {1}" -f $newest,$path)

#construct a title for Out-GridView
$Title=("Recent PowerShell Files in {0}" -f $path.ToUpper())

Get-ChildItem -Path $Path -Include $include -Recurse |
Sort-Object -Property lastWriteTime -Descending |
Select-Object -First $newest -Property LastWriteTime,CreationTime,
@{Name="Size";Expression={$_.length}},
@{Name="Lines";Expression={(Get-Content $_.Fullname | Measure-object -line).Lines}},
Directory,Name,FullName |
Out-Gridview -Title $Title

}

I decided to try something different with the -Path parameter. I set the default to a global variable, ScriptPath. The idea is that in your PowerShell profile, you'll have a line like this:


$ScriptPath="C:\Scripts"

If the function finds this variable, it will use it. Otherwise it will use the current location. Notice I’m also using a validation attribute to verify the path. By default the function returns the 10 newest PowerShell files, based on the last write time. The number of files can be controlled by the -Newest parameter.

In the heart of the script is a one-line expression to find all matching files in the script folder and subfolders.


Get-ChildItem -Path $Path -Include $include -Recurse |
Sort-Object -Property lastWriteTime -Descending

These files are then sorted on the LastWriteTime in descending order and then I only select the first $newest number of files.


| Sort-Object -Property lastWriteTime -Descending |
Select-Object -First $newest ...

I am only interested in a few file properties so I select them. I also add a custom property to measure the file and get the number of lines in the script.


...-Property LastWriteTime,CreationTime,
@{Name="Size";Expression={$_.length}},
@{Name="Lines";Expression={(Get-Content $_.Fullname | Measure-object -line).Lines}},
Directory,Name,FullName

Finally the results are piped to Out-Gridview.


... | Out-Gridview -Title $Title

For now, I have to manually open the file. Perhaps I’ll create a WinForm or use ShowUI to integrate it into the PowerShell ISE.

You can download Get-LatestScript which includes comment based help.