Merrill Aldrich

Quick and Dirty PowerShell SQL Server Load Test

Today I wanted to share a very short PowerShell script that can be used to simulate running multiple clients against a SQL Server, for simple performance or hardware testing. One thing I love about PowerShell is that so many functions and features are pre-built and easily accessible that I have to write very little code to accomplish a lot. This script uses the background job feature to run asynchronous processes, and the sqlps module's Invoke-Sqlcmd to run SQL scripts stored in files, and I didn't have to engineer any of that, which is cool. It uses out-of-the-box PowerShell 2 and the SQL Server 2012 client tools, but would work against basically any version of SQL Server on the server side. Not bleeding edge at all.

# Start multiple clients running the same SQL script simultaneously

$numWorkers = 10

$startTime = ( Get-Date ).AddMinutes(1)

$jobscript = {
    param( $startTime )
         
    while ( ( Get-Date ) -lt $startTime ) { Start-Sleep -Milliseconds 100 }
    
    Invoke-Sqlcmd -InputFile 'path to your\sqlscript.sql' -ServerInstance yourSQLServer -Database yourDB -QueryTimeout 0  
}

(1..$numWorkers) | foreach {
    Start-Job $jobscript -ArgumentList $startTime -InitializationScript { Import-Module "sqlps" -DisableNameChecking }
}

The script does just one thing: given a SQL script stored in a file, it will start some number of simultaneous connections running the SQL against a server. Let’s break it down a bit to see how.

First we establish the number of client connections to use, and identify a single start time for them to begin running, one minute from the current time. This extra minute will give all the threads time to get created/configured so that later we can cause them to start at approximately the same moment:

$numWorkers = 10

$startTime = ( Get-Date ).AddMinutes(1)

Next, we need a "mini PowerShell script" inside this script to pass to the async jobs to run. This "mini script" is what each worker job actually executes. There are a few ways to create one, but a simple way is just to assign a script block { } to a variable:

$jobscript = { work for the async job goes here }

The background job feature has some behavior that is not completely intuitive – for one thing, variables in your calling script won’t be accessible from a running job, because the job runs in a completely separate session/environment. For that reason, you have to explicitly pass in any values that you need in the code that the job runs. This is the reason for the first line inside the job script:

param( $startTime )

Using this parameter, we can pass the single start time value from the outer script to all the background jobs. Without it the jobs could not “see” the value.
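
To see that scoping behavior in isolation, here is a tiny sketch (not part of the load test script) showing that a job cannot read the caller's variables unless they are passed in:

# Illustration only: a background job runs in its own session
$message = 'set in the calling session'

# Without a parameter, $message is empty inside the job
Start-Job { "job sees: '$message'" } | Wait-Job | Receive-Job

# With param() and -ArgumentList, the value is passed in explicitly
Start-Job { param( $message ) "job sees: '$message'" } -ArgumentList $message | Wait-Job | Receive-Job

The first job prints an empty value; the second prints the string, because it arrived via param() and -ArgumentList.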

Next, the body of the job script is simply “wait for the start time, then execute this SQL script”:

    while ( ( Get-Date ) -lt $startTime ) { Start-Sleep -Milliseconds 100 }
    
    Invoke-Sqlcmd -InputFile 'path to your\sqlscript.sql' -ServerInstance yourSQLServer -Database yourDB -QueryTimeout 0  

Note: Invoke-Sqlcmd is not loved by everyone, and it has some limitations, but it's there and mostly works. If you are on pre-2012 SQL client tools, beware the bug for -QueryTimeout 0 and long-running queries.
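
If Invoke-Sqlcmd is a deal-breaker for you, the job script could call ADO.NET directly instead. This is only a rough sketch, using the same placeholder server, database and file path as above, and note that unlike Invoke-Sqlcmd it will not understand GO batch separators:

# Sketch of an Invoke-Sqlcmd alternative that calls ADO.NET directly
$sql  = [System.IO.File]::ReadAllText( 'path to your\sqlscript.sql' )
$conn = New-Object System.Data.SqlClient.SqlConnection 'Server=yourSQLServer;Database=yourDB;Integrated Security=SSPI'
$conn.Open()
$cmd = $conn.CreateCommand()
$cmd.CommandText    = $sql
$cmd.CommandTimeout = 0        # no timeout, like -QueryTimeout 0
[void] $cmd.ExecuteNonQuery()
$conn.Close()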

That is all the code we need to run within the background jobs.

Lastly, we need to spin those background jobs up. This funny thing generates a list of numbers, 1 to 10:

(1..$numWorkers)

That list is piped to a foreach loop, which just drives the loop to run 10 times. For each iteration of the loop, we create a background job set to run the code in $jobscript:

    Start-Job $jobscript -ArgumentList $startTime -InitializationScript { Import-Module "sqlps" -DisableNameChecking }

This statement again shows some of the special behavior of background jobs: first, you have to explicitly pass in any values using the -ArgumentList parameter, because the scope where the job runs cannot "see" your other code or variables. This is how to get the startTime value into all the jobs.

Second, the background job won't have your PowerShell profile. In my profile, I load the sqlps module, which is where Invoke-Sqlcmd lives. Since my profile doesn't load for the job, the job would not have that module available, and would not know what "Invoke-Sqlcmd" is. Fortunately, the Start-Job cmdlet has an -InitializationScript option that allows you to pass a second code block that runs before the real work in the job. This is a good place to set up or configure the job, in this case loading the required module.

The last item you’ll need for this is the actual SQL script that has your testing logic, like running resource-intensive queries in a loop. I will leave that as an exercise for the reader.
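
If you just want something trivial to smoke-test the plumbing, a placeholder script could be generated like this (purely illustrative; dbo.SomeBigTable is made up, and your real test logic goes in its place):

# Illustration only: write a trivial placeholder script to disk
@"
SET NOCOUNT ON;
DECLARE @i int;
SET @i = 0;
WHILE @i < 1000
BEGIN
    SELECT COUNT(*) FROM dbo.SomeBigTable;
    SET @i = @i + 1;
END
"@ | Out-File 'path to your\sqlscript.sql'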

Once the jobs are running you can obviously watch PerfMon or use other diagnostics to see how the subject server is handling the load. Back in PowerShell, you can use the cmdlets Get-Job, Receive-Job, Stop-Job and Remove-Job to manage or view the jobs created above.
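
For example, assuming these are the only jobs in your session, you might check on them and clean up like this:

# See the state of each worker
Get-Job | Format-Table Id, State, HasMoreData -AutoSize

# Block until they all finish, then collect any output or errors
Get-Job | Wait-Job | Receive-Job

# Clean up the finished jobs
Get-Job | Remove-Job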

I used this to test a new server yesterday with a PCI flash storage card and drove it to 163,000 real, SQL Server, random read IOs/sec, which was pretty cool. Happy load testing!

Published Friday, July 12, 2013 8:17 AM by merrillaldrich
