SQLblog.com - The SQL Server blog spot on the web

Allen White

Pull Performance Data from Perfmon Logs with PowerShell

For the last year and a half or so I've been presenting a session on how to gather perfmon data with PowerShell. Here's a link to my blog post from after the 2010 Tech Ed conference that has the download scripts. These scripts include the database tables and stored procedures I use to store the data for reporting.

Sometimes when I'm at client sites I'm unable to run that script, so I'll use perfmon itself, recording the data to log files, and then copy the log files to my system, where I load the data directly into the database tables. I've written a couple of reports that help me easily see performance trends.

Up until recently, I used the relog utility to convert the log files to csv format, then used the Import-Csv cmdlet to pull the data into PowerShell, and then loaded the tables from there. Then I stumbled on this Scripting Guys post from last month, and it gave me an idea. Rather than using relog to output a temporary csv file and then Import-Csv to pull it back in, I should just be able to pull in the perfmon log files directly.

Note: this only works in PowerShell 2.0 and above.
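If you want the script to fail fast on an older host, a minimal guard looks like this. (This check isn't part of my script; it relies on the fact that the $PSVersionTable automatic variable was introduced in PowerShell 2.0, so it doesn't exist at all on 1.0.)

```powershell
# Import-Counter shipped with PowerShell 2.0; bail out early on older hosts,
# where the $PSVersionTable automatic variable is not defined.
if ($PSVersionTable -eq $null -or $PSVersionTable.PSVersion.Major -lt 2) {
	throw 'This script requires PowerShell 2.0 or later.'
}
```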

In addition to the Get-Counter cmdlet, the Scripting Guys post talks about using Export-Counter. I checked and found that Import-Counter also exists, and so I figured out how to pull in the log file contents. Let's take a look.

First, I'll pass in the log file name through command-line arguments, then I'll use Import-Counter to load the contents of the log file into a variable.

param(
  [string]$logfil = $null   # path to the perfmon log file to import
  )

# Read the log into a collection of counter sample sets
$logcntr = Import-Counter $logfil
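Before wiring the data into the database, it's worth eyeballing what came back. A quick interactive check (not part of the script itself) might look like this:

```powershell
# Inspect the first sample set: when it was captured and what it contains
$logcntr[0].Timestamp
$logcntr[0].CounterSamples | Select-Object Path, InstanceName, CookedValue
```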

The two key properties in the counter object (technically it's the Microsoft.PowerShell.Commands.GetCounter.PerformanceCounterSampleSet object) are Timestamp and CounterSamples. I iterate through the rows in the log file using the foreach command, assigning each row to the $cntr variable, and extract the Timestamp as $dtm (which is a datetime object) and the CounterSamples collection as $clist. I then use the foreach command to iterate through the counter samples gathered at the time specified in Timestamp.

foreach ($cntr in $logcntr) {
	$dtm = $cntr.Timestamp
	$clist = $cntr.CounterSamples
	foreach ($sample in $clist) {

The members of the CounterSamples collection have three properties that matter here: Path, InstanceName and CookedValue. The Path property identifies which counter was collected. The InstanceName property identifies which instance the counter was collected for; if the counter is '% Processor Time', for example, the instances represent the individual processors in the host system. In my data collection I'm always collecting just the '_total' instance. (Your mileage may vary.) The CookedValue property contains the value collected for that counter. In my script I want to assign an individual variable to each of the counters I'm collecting, so within the loop I use the switch statement with a wildcard pattern detailed enough to tell which counter I've got, and then assign the appropriate variable the value from the CookedValue property.

		switch -wildcard ($sample.Path) {
			'*% Processor Time*' { $ppt = $sample.CookedValue }
			'*Available MBytes*' { $mab = $sample.CookedValue }
			'*Paging File*' { $pfu = $sample.CookedValue }
			'*Avg. Disk sec/Read*' { $drs = $sample.CookedValue }
			'*Avg. Disk sec/Write*' { $dws = $sample.CookedValue }
			'*Processor Queue Length*' { $pql = $sample.CookedValue }
			'*Forwarded Records/sec*' { $frs = $sample.CookedValue }
			'*Page Splits/sec*' { $pss = $sample.CookedValue }
			'*Buffer cache hit ratio*' { $bch = $sample.CookedValue }
			'*Page life expectancy*' { $ple = $sample.CookedValue }
			'*Log Growths*' { $log = $sample.CookedValue }
			'*Processes blocked*' { $blk = $sample.CookedValue }
			'*Batch Requests/sec*' { $brs = $sample.CookedValue }
			'*SQL Compilations/sec*' { $cmp = $sample.CookedValue }
			'*SQL Re-Compilations/sec*' { $rcm = $sample.CookedValue }
			 }
		}

Now I can use my stored procedures to store the values collected for each row, just like I do when I'm running the getperf.ps1 script.

	#Send the machine counters to our database
	$q = "declare @ServerID int; exec [Analysis].[insServerStats]"
	$q = $q + " @ServerID OUTPUT"
	$q = $q + ", @ServerNm='" + [string]$mach + "'"
	$q = $q + ", @PerfDate='" + [string]$dtm + "'"
	$q = $q + ", @PctProc=" + [string]$ppt
	$q = $q + ", @Memory=" + [string]$mab
	$q = $q + ", @PgFilUse=" + [string]$pfu
	$q = $q + ", @DskSecRd=" + [string]$drs
	$q = $q + ", @DskSecWrt=" + [string]$dws
	$q = $q + ", @ProcQueLn=" + [string]$pql
	$q = $q + "; select @ServerID as ServerID"
	$res = invoke-sqlcmd -ServerInstance $sqlsrv -Database $destdb -Query $q
	$SrvID = $res.ServerID
			
	#Send the instance counters to the database
	$q = "declare @InstanceID int; exec [Analysis].[insInstanceStats]"
	$q = $q + " @InstanceID OUTPUT"
	$q = $q + ", @ServerID=" + [string]$SrvID
	$q = $q + ", @ServerNm='" + [string]$mach + "'"
	$q = $q + ", @InstanceNm='" + [string]$srvnm + "'"
	$q = $q + ", @PerfDate='" + [string]$dtm + "'"
	$q = $q + ", @FwdRecSec=" + [string]$frs
	$q = $q + ", @PgSpltSec=" + [string]$pss
	$q = $q + ", @BufCchHit=" + [string]$bch
	$q = $q + ", @PgLifeExp=" + [string]$ple
	$q = $q + ", @LogGrwths=" + [string]$log
	$q = $q + ", @BlkProcs=" + [string]$blk
	$q = $q + ", @BatReqSec=" + [string]$brs
	$q = $q + ", @SQLCompSec=" + [string]$cmp
	$q = $q + ", @SQLRcmpSec=" + [string]$rcm
	invoke-sqlcmd -ServerInstance $sqlsrv -Database $destdb -Query $q
	}

This is a much cleaner solution that doesn't require my script to create or clean up any temporary files, and gets the performance data into the database where it's more easily examined.
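Saved as a script, running it against a copied log is then a one-liner. (The script and log file names below are just illustrations, and the full script would also need to set the $mach, $srvnm, $sqlsrv and $destdb variables the queries above rely on.)

```powershell
# Load a perfmon binary log copied from a client server into the database
.\import-perflog.ps1 -logfil 'C:\PerfLogs\SQLPerf.blg'
```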

Allen

Published Sunday, August 21, 2011 8:04 PM by AllenMWhite


About AllenMWhite

Allen White is a consultant and mentor for Upsearch Technology Services in Northeast Ohio. He has worked as a Database Administrator, Architect and Developer for over 30 years, supporting both the Sybase and Microsoft SQL Server platforms over that period.
