
Allen White

  • 24 Hours of PASS - PowerShell 101 for the SQL Server DBA

    Thanks to everyone who came out for the session today.  It was a lot of fun and I hope you had a great experience as well.  I've attached the demo scripts and slide deck to this post, and I look forward to seeing you in November at the PASS Summit!


  • Speaking - 24 Hours of PASS, Summit Preview Edition

    There's so much to learn to be effective with SQL Server, and you have an opportunity to immerse yourselves in 24 hours of free technical training this week from PASS, via the 24 Hours of PASS event.

    I'll be presenting an introductory session on PowerShell called PowerShell 101 for the SQL Server DBA. Here's the abstract:

    The more you have to manage, the more likely you'll want to automate your processes. PowerShell is the scripting language that will make you truly effective at managing lots of servers. But it's more than just a scripting language - it's an interactive shell that stores data for you and allows you to implement ad-hoc solutions quickly and easily. Within the PowerShell environment you can easily manage both SQL Server instances and the Windows servers themselves, giving you a 'best of both worlds' environment that puts you in control. This session will introduce you to PowerShell and show you how to use it to manage SQL Server across many instances.

    I look forward to seeing you there.


  • Speaking - Red Gate's SQL in the City

    The great folks at Red Gate have invited me to join the festivities at the SQL in the City events in both Chicago on October 5, and in Seattle on November 5.

    In both cities I'll be presenting a session entitled Automated Deployment: Application And Database Releases Without The Headache. Here's the abstract:

    Ever since applications were first created, the deployment of updates and changes has been a headache, with the potential of disruption of the application at best and data corruption at worst. Getting the steps of any deployment right is critical to its success, and recovering when something goes wrong is even harder. Data structures depend on code, and code depends on data structures. In this session we'll look at how deployment has been done in the past, the problems encountered, and some ways to mitigate the risk inherent in application deployment, including Red Gate's new Deployment Manager tool, which helps you streamline the whole process.

    There is a great lineup of speakers at all the events, including my good friends Steve Jones (b | t) and Grant Fritchey (b | t), who both work for Red Gate, but also people like Brent Ozar (b | t), Jes Borland (b | t), Buck Woody (b | t) and Adam Machanic (b | t), all good friends and great speakers.

    You can sign up for the Chicago event here and for the Seattle event here. I look forward to seeing you!


  • T-SQL Tuesday #34: Help me, Obi-Wan Kenobi, You're My Only Hope!

    This T-SQL Tuesday is about a person that helped you understand SQL Server.

    It's not a stretch to say that it's people who help you get to where you are in life, and Rob Volk (@sql_r) is sponsoring this month's T-SQL Tuesday, asking who that person was for you.

    Over the years, there've been a number of people who've helped me, but one person stands out above the rest: someone who was patient and kind, and who always explained the details in a way that just made sense.

    I first met Don Vilen at Tech Ed 2006 in Boston. I'd seen him present at the PASS Summit a couple of times, but at that Tech Ed I'd started to get to know people and felt comfortable just hanging out in the SQL Server area, and we started talking. He was easy going, and in talking about issues at work he explained how things worked in a way that no one else had (at least to me). As a side note, it was at that event that he introduced two of his friends to each other: Kimberly Tripp and Paul Randal.

    At subsequent conferences I made a point to see his presentations, no matter the subject. His explanation of how database snapshots work was the clearest I've ever seen, and I've used it in many classes I've taught since. I always enjoyed his company, and conversation with him was lively and entertaining, regardless of the subject. I was sad to find out, shortly after I received my first MVP Award in July 2007, that he was leaving Microsoft. He joined Scalability Experts as a Principal Architect that fall.

    In July 2008, through a strange set of circumstances, I joined Scalability Experts myself, as a trainer. Since there were more consulting engagements than training assignments at first, I was assigned to a project, and the architect assigned to the project was none other than Don. He mapped out what we had to do and explained some aspects of SQL Trace I'd never used, and I was able to create a significant load test out of a relatively small trace file with his help.

    The following spring another assignment put me together with Don evaluating different server hardware using the same workload, and with Don's help I was able to get a much better understanding of baselines and performance metrics, and I still use that knowledge in client engagements and in my presentations today.

    The last time we worked together was for a major financial services company in New York City, and Don led a team of us to put together a Center of Excellence plan for the company, to help their staff become the best in the business with respect to their use of SQL Server. My focus was on the Database Developers, and with Don's help I built presentations and documentation on all aspects of developing fast and efficient queries and processes to run against SQL Server. The environment was both high profile and highly stressful, and Don always kept his cool and was always helpful.

    After I left Scalability, my daughter and son-in-law were flying back to the US from Germany, where my son-in-law was stationed in the Army, when a blizzard hit the east coast and they shut down the airports after their plane landed at JFK. Because of the conditions there were no hotels, and the next flight the kids could get out wasn't until four days later. I called Don, who had an apartment not far from Central Park, and he was kind enough to put the kids up for the night. I drove to New York and picked them up there and brought them home the next day.

    There've been great people who've helped me before and since, but no one stands out like Don. Thank you, Don, for everything.


  • Your Presence Matters at the PASS Summit

    This year will be my tenth year attending the annual PASS Summit. It's in Seattle again this year, which to me is important because of the proximity to the Microsoft offices and the people on the SQL Server dev team, as well as the CX (formerly CAT) teams that help so many people get the most out of SQL Server.

    The conference is the biggest event in the world specifically focused on SQL Server. As a result, it's an opportunity to meet many people who are directly focused on the SQL Server platform, and those individuals have created a unique community around PASS and the PASS Summit. It's within that community where the relationships develop that help you solve the problems you encounter every day when working with SQL Server. The key word here is relationships. Technology is always changing, but those relationships will help you navigate that changing technology, because those people are going through the same things you are.

    It's not just attendees who benefit from those relationships, though. I've been fortunate to have been selected to speak at the Summit each year since 2006. As I've gotten to know the other speakers, those acquaintances have become friendships, and some of them have flourished into personal friendships that go far beyond SQL Server. I look forward to the big events because I get to see my friends again, and we all get to hang out together and do some crazy things like karaoke.

    More important, though, is that I get the opportunity to learn from others. No one person knows it all. It's not possible; SQL Server is just too big. Even with topics I know really well, I've discovered there's always some new aspect I've never encountered that I can learn from others. For example, when I was at SQL Rally in Dallas I attended a basic SQL Server backup session given by Sean McCown. I've been doing SQL Server backups for 20 years, but I learned something in that session. No matter who you are, and how much you know, you can learn something from others.

    For those reasons it's important for you to be there. You'll learn something from the people who are there. We'll learn something from you. We'll get to spend some time together, and maybe get to know each other a little better. We're all better off for it, and this is how we build our community.

    So no matter if you're a brand-spanking-new DBA or the person who wrote the query processor, please make a point to come to the PASS Summit. Let's spend some time together, have a little fun, learn something, and most of all, make the community a little bit stronger.

    I look forward to seeing you there.


  • SQLCruise Alaska was Amazing

    You'd think that providing in-depth SQL Server training on a cruise ship would be an excuse for a vacation disguised as a business trip, but you'd be wrong. This past week I traveled with the founders of SQLCruise, Tim Ford and Brent Ozar, along with other top professionals in the SQL Server world - Jeremiah Peschka, Kendra Little, Kevin Kline and Robert Davis.

    The week began with Brent presenting a session on Plan Cache Analysis, which I plan to start using very soon. After Brent, Kevin presented a four-hour leadership training session. He's recorded a set of training videos which will really help IT professionals make the transition to management.

    After spending the day snorkeling in Ketchikan, Alaska with Snorkel Alaska, I presented a short session on SQL Sentry's Plan Explorer and announced Wayne Sheffield as the winner of the Power Suite from SQL Sentry for having the "hairiest execution plan". SQL Sentry paid for my cruise and travel as their representative, and I'm really grateful to them for the opportunity. They have a great suite of products, and they're great people, and that makes me proud to recommend them, as well as to represent them. Following the contest, Jeremiah did an interesting session called "Killer Techniques for Database Performance" where he went into really great detail about the real cost of accessing data in applications.

    Tuesday was mostly a sightseeing day, as we spent the day in Juneau, where I did a tour called "Glacier View Bike and Brew", put together by Cycle Alaska. A van picked us up and took us north to the University of Alaska Southeast on Auke Lake, and we rode our bikes nine miles to Mendenhall Lake, where we got great views of Mendenhall Glacier, then on to the glacier visitor center. Then they drove us into Juneau to a pub where we had a tasting of the different beers from the Alaskan Brewing Company. We reboarded the ship and cruised out to Sawyer Glacier to see that spectacular sight, and on the way participated in a storytelling session in Brent's suite. Some great stories were told, and Darcy Williams won an Amazon Kindle Fire for the best IT story told.

    Wednesday was spent in Skagway, where Tim arranged a van tour up through the White Pass which the gold rush prospectors had to traverse to get to the fields near Dawson City in the Yukon Territory. We followed that up with lunch and beverages at the Skagway Brewing Company, or "BrewCo", as they call it locally.

    After three days spent mostly sightseeing, we got back into serious training on Thursday, when I did a shortened version of my PowerShell pre-con. It's hard to deliver a truly comprehensive "PowerShell for SQL Server" session in four hours, but I think I covered enough to get people familiar with the basics and able to get real work done, and I showed some advanced features to give them a sense of what's possible once they're up to speed. After my session, Kendra did a session called "Sports Medicine For High Transaction Databases" where she talked about benchmarking and methods to help your high-volume transactional systems perform under heavy load. The day ended with Robert doing a session on "Recovering Data and Databases" where he talked about how data is stored, how it's backed up, and how to recover from corruption.

    The last day of the cruise started with Brent, this time talking about Scaling SQL with Solid State, where I learned that even putting solid state drives in place may not solve I/O subsystem performance problems, and how to plan for the best performance using them. We followed that up with the Idera Iditarod Challenge, where the cruisers had to answer SQL questions and perform additional challenges, and winners Malathi Mahadevan and Max Gomeniouk each won an Amazon Kindle Fire from Idera. We ended the day visiting Victoria, BC.

    The best part of the cruise, aside from the great training, was getting to know the cruisers and spending real time with them. It's often hard when working as a database professional because we seldom work in a team, and it can be difficult to know what questions to ask when presented with a problem. This training is unique because we spend time together, not just in class, but visiting the sights, and talking about SQL. After a week together we all have new people we can go to with questions and ideas, to see if they're valid or to get problems solved. I'm truly impressed with the work that Tim does in making the cruise a success for everyone, and I hope you'll consider signing up for one of next year's cruises. Check it out at


  • SQL Rally Presentations

    As I drove to Dallas for this year's SQL Rally conference (yes, I like to drive) I got a call asking if I could step in for another presenter who had to cancel at the last minute.  Life happens, and it's best to be flexible, and I said sure, I can do that. Which presentation would you like me to do? (I'd submitted a few presentations, so it wasn't a problem.)

    So yesterday I presented "Gathering Performance Metrics With PowerShell" at 8:45AM, and my newest presentation, "Manage SQL Server 2012 on Windows Server Core with PowerShell", at 1PM at the conference here in Dallas.  The room was packed for both sessions, and I truly appreciate every one of you who attended; there were great questions in both.

    I've attached a zip file containing the slide decks from both presentations along with the demo scripts.  I hope you get as much out of using PowerShell with SQL Server as I have. Thanks!


  • Script and Migrate Agent Jobs between Servers using PowerShell

    I'm nearing the end of a SQL Server upgrade project, migrating from SQL Server 2000 to SQL Server 2008 R2. The company uses SQL Server Agent to run about 150 jobs which execute DTS packages to do the work of moving data around. Now, there are any number of ways to move these jobs from one server to the other, but I wanted to be able to exclude certain jobs, make some minor changes before loading them onto the new server, that kind of thing. PowerShell makes that relatively easy to do.

    First, most of the jobs were owned by individual logins, many of which belong to people no longer with the company. This happens over time, but it should be cleaned up. To quickly modify the jobs so they're all owned by 'sa', it takes just a few lines of code:

    # Connect to the instance using SMO
    $s = new-object ('Microsoft.SqlServer.Management.Smo.Server') $inst
    # Iterate through the jobs to reset the job owner
    foreach ($job in $s.JobServer.Jobs) {
    	if ($job.Owner -ne 'sa') {
    		$job.OwnerLoginName = 'sa'
    		$job.Alter()
    	}
    }

    My scripts usually pass in $inst, the SQL Server instance name, as a command-line argument, so I can use the script on any server. Once this script had run, all the jobs were owned by sa, and I don't have to worry about enabling old logins on the new server. The next step is to script out the jobs into .sql files, one for each job. In this case I pass in a base directory path as a second command-line argument, and then use PowerShell to create the Jobs directory under it (note that if you're using a named instance, I'd recommend using the encode-sqlname cmdlet to turn the $inst variable into an encoded name to prevent the backslash from creating problems; I'll show a quick sketch of that below):

    # Create the Jobs directory under the base path if it doesn't exist
    if (!(Test-Path -path "$base\$inst\Jobs\")) {
    	New-Item "$base\$inst\Jobs\" -type directory | out-null
    }
    $homedir = "$base\$inst\Jobs\"
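
    That encode-sqlname step might look something like this (a minimal sketch, assuming the sqlps snap-in or module that provides the cmdlet is loaded):

    # Encode the instance name so the backslash in a named instance doesn't
    # get treated as a path separator (e.g. SERVER\INST01 becomes SERVER%5CINST01),
    # then use $encinst in place of $inst when building the directory path
    $encinst = encode-sqlname $inst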

    The next step is to set the properties for the script files:

    # Instantiate the Scripter object and set the base properties
    $scrp = new-object ('Microsoft.SqlServer.Management.Smo.Scripter') ($s)
    $scrp.Options.ScriptDrops = $False
    $scrp.Options.WithDependencies = $False
    $scrp.Options.IncludeHeaders = $True
    $scrp.Options.AppendToFile = $False
    $scrp.Options.ToFileOnly = $True
    $scrp.Options.ClusteredIndexes = $True
    $scrp.Options.DriAll = $True
    $scrp.Options.Indexes = $True
    $scrp.Options.Triggers = $True

    Then we can cycle through the jobs and create scripts for each job on the server.

    # Create the script file for each job
    foreach ($job in $s.JobServer.Jobs) {
    	$jobname = $job.Name
    	$scrp.Options.FileName = "$homedir\$jobname.sql"
    	$scrp.Script($job)
    }

    There are some jobs on the original server that won't move - things like maintenance plan jobs, which I'm replacing with more efficient methods, so having individual script files allows me to simply delete the script files for the jobs not moving to the new server. Once I've done that I have another script which will load the script file into a PowerShell variable, and using invoke-sqlcmd, create the job on the new server. Now, each script file ends with a 'GO' statement, and invoke-sqlcmd doesn't like that, so I want to strip it out, which is easy enough to do with the Replace method. The problem is that it also contains a GOTO command to bypass the error notification. I use the Replace() method to get around that by first replacing GOTO with OTOG, then replacing GO with an empty string, then replacing OTOG again with GOTO.

    The other 'tricky' thing I had to do was to convert the script file to a PowerShell string. I use the Get-Content cmdlet to pull the script into a PowerShell object. I then use the here-string construct with the PowerShell object inside the here-string, and that converts it from an object to a System.String object for me, and then I can easily supply that string as the query for invoke-sqlcmd.

    # Get all the files with a .sql extension in the directory and cycle through them
    $files = get-childitem $basedir | where {$_.Name -like '*.sql'}
    foreach ($file in $files) {
    	$fname = $file.Name
    	$sql = get-content "$basedir$fname"
    	# Wrap the file contents in a here-string to get a single System.String
    	$qry = @"
    $sql
    "@
    	# Protect GOTO, strip the GO batch separators, then restore GOTO
    	$qry = $qry.Replace('GOTO','OTOG')
    	$qry = $qry.Replace('GO','')
    	$qry = $qry.Replace('OTOG','GOTO')
    	invoke-sqlcmd -ServerInstance $inst -Database master -Query $qry
    }

    Now the jobs are loaded on the new server, but of course the problem is that they're ACTIVE. Until we go live we don't want them running, so we need to disable them en masse. That's pretty simple as well.

    $jobs = $s.JobServer.Jobs 
    foreach ($job in $jobs) {$job.IsEnabled = $False; $job.Alter()}

    And now the jobs are all in place, and ready to be enabled once the green light is given on the migration.
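
    When the green light does come, the mirror image of that loop (with $s connected to the new server) turns them back on:

    # Re-enable the migrated jobs at go-live
    foreach ($job in $s.JobServer.Jobs) {$job.IsEnabled = $True; $job.Alter()}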


  • System Inventory and Performance Gathering SQL Connections Session

    Yesterday I delivered a session on maintaining a system inventory and gathering performance data with PowerShell.  The session went well, there were great questions, and I'd like to thank everyone who attended.  I've attached the slides and demo scripts from the session for your use.


  • PowerShell Precon session at SQL Connections

    Yesterday I had the privilege of presenting the full day training session SPR304-Automate and Manage SQL Server with PowerShell at the SQL Connections conference in Las Vegas.  The session went very well (at least from my perspective) and I think the attendees enjoyed it as well.

    Just the day before the session I got excited about some features of PowerShell I hadn't played with much and decided to add a discussion of them to the presentation, so the materials the conference handed out don't include that content.  (I also had a minor amount of demo-fail, so I've corrected those scripts as well.)

    I've attached the presentation deck and all the demo scripts to this post, but please, don't ask me for the password. If you were in the room yesterday you know what it is, and as the beer commercial says, "this one's for you."

    Thanks for attending!


  • Speaking About SQL Server

    There's a lot of excitement in the SQL Server world right now, with the RTM (Release to Manufacturing) release of SQL Server 2012, and the availability of SQL Server Data Tools (SSDT). My personal speaking schedule has exploded as well. Just this past Saturday I presented a session called Gather SQL Server Performance Data with PowerShell.

    There are a lot of events coming up, and I hope to see you at one or more of them. Here's a list of what's scheduled so far:

    First, I'll be presenting a session called Automate Policy-Based Management Using PowerShell at the 24 Hours of PASS on March 21.

    The Policy-Based Management feature in SQL Server 2008 provides a great way to ensure your systems are configured consistently and correctly, but it can be tedious to implement on each server in your environment. PowerShell scripts let you automate the implementation of your policies so that you can focus on more important problems. This session walks you through how PBM works, how to define your policies in PowerShell, and how to set up Agent jobs that evaluate those policies regularly and let you know when you need to take action.


    The following week, I'm really excited to be one of the featured presenters at the SQL Connections conference in Las Vegas. This event is also one of the official launch events for SQL Server 2012. At this event I'll be presenting an all day workshop called SPR304: Automate and Manage SQL Server with PowerShell. With PowerShell becoming more and more a key tool in managing all server systems, this workshop will get you up to speed quickly, ready to take control of SQL Server.

    This soup-to-nuts all day workshop will first introduce you to PowerShell, after which you’ll learn the basic SMO object model, how to manipulate data with PowerShell and how to use SMO to manage objects. We’ll then move on to creating Policy-Based Management policies, work with the Central Management Server, manage your system inventory and gather performance data with PowerShell. We’ll wrap up with a look at the new PowerShell cmdlets introduced for SQL Server 2012 and how you can use PowerShell to manage SQL Server 2012 in server environments including the new Always On technology and Windows Server Core. After this one day, you’ll be ready to go to work and able to use PowerShell to make you truly effective.
    In addition, I'll be doing three regular conference sessions.
    SQL226: Scalable Application Design with Service Broker. How your business applications interact with the corporate database directly can have a dramatic impact on how successful you are, and one of the biggest issues is performance with large databases. By designing the application to use asynchronous messaging for non-critical updates you can focus performance concerns to where the timeliness is critical. SQL Server Service Broker handles the asynchronous requests quickly and efficiently, and minimizes the conflicts with your critical updates. This session will walk you through the steps to get Service Broker up and running and providing a consistent delivery of your business data.
    SQL325: Maintain SQL Server System and Performance Data with PowerShell. Maintaining a solid set of information about our servers and their performance is critical when issues arise, and often help us see a problem before it occurs. Building a baseline of performance metrics allows us to know when something is wrong and help us to track it down and fix the problem. This session will walk you through a series of PowerShell scripts you can schedule which will capture the most important data and a set of reports to show you how to use that data to keep your server running smoothly.
    SQL427: Manage SQL Server 2012 on Windows Server Core with PowerShell. Windows Server 2008 introduced Server Core, the operating system without the Graphical User Interface. SQL Server 2012 is the first version of SQL Server to support Server Core. To gain the maximum efficiency from your servers you want to make use of Server Core, and with PowerShell you can effectively manage SQL Server in that environment. This session will demonstrate how to manage SQL Server 2012 on Windows Server Core and provide scripts to help you perform the common tasks you need to perform on that platform.


    On April 14, I'll be in Atlanta at SQL Saturday 111, presenting the Manage SQL Server 2012 on Windows Server Core with PowerShell session from SQL Connections.

    The Akron-Canton Chapter of AITP invited me to speak at their monthly meeting on April 24, where I'll be presenting Scalable Application Design with Service Broker.

    I was excited to be selected for this year's PASS SQL Rally conference in Dallas May 10-11, where I'll be presenting the Manage SQL Server 2012 on Windows Server Core with PowerShell session as well.

    Finally, and I am really excited about this opportunity, SQL Sentry asked me to represent them on this May's SQL Cruise to Alaska. The SQL Cruise is a great opportunity for professionals to get focused training on SQL Server from some of the top professionals in the SQL Server industry. If you look at the cost per training day, including hotel and meals, and compare it to the cruise, you'll find that the cruise works out as a better price than traditional training, and you'll get to see some of the most beautiful landscapes in the country! I'll be presenting a four hour version of my PowerShell workshop, and I'd love to see you there.

    Well, that's what's on the schedule for the spring. The summer will be a bit less busy, but I plan to get to a few events over the summer as well. I hope to see you at some of these great training events!


  • Set and Verify the Retention Value for Change Data Capture

    Last summer I set up Change Data Capture for a client to track changes to their application database to apply those changes to their data warehouse. The client had some issues a short while back and felt they needed to increase the retention period from the default 3 days to 5 days. I ran this query to make that change:

    sp_cdc_change_job @job_type='cleanup', @retention=7200

    The value 7200 is the number of minutes in 5 days (5 days x 24 hours x 60 minutes). All was well, but they recently asked how they could verify that the change was made. Well, it turns out the values are stored in the msdb database, in a table called dbo.cdc_jobs. I put together this quick query to find the retention value for any database with CDC enabled (run it in the context of that database):

    SELECT [retention]
      FROM [msdb].[dbo].[cdc_jobs]
      WHERE [database_id] = DB_ID()
      AND [job_type] = 'cleanup'

    It's simple, but without knowing where to look, it can be confusing.
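
    As a quick cross-check, the sys.sp_cdc_help_jobs procedure, run in the CDC-enabled database, reports the same settings for both the capture and cleanup jobs:

    -- The cleanup job row should now show retention = 7200
    EXEC sys.sp_cdc_help_jobs;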


  • Load Perfmon Log Data into SQL Server with PowerShell

    In yesterday's post I showed you how I set up performance data gathering in an environment where I can't directly update my analysis database. Today I'll show you how I load that data into SQL Server for the analysis.

    First, I create tables to contain the performance counters in my analysis database. Here's what they look like:

    CREATE TABLE [Analysis].[ServerStats] (
        ServerID    int IDENTITY(1,1) NOT NULL,
        ServerNm    varchar(30) NOT NULL,
        PerfDate    datetime NOT NULL,
        PctProc     decimal(10,4) NOT NULL,
        Memory      bigint NOT NULL,
        PgFilUse    decimal(10,4) NOT NULL,
        DskSecRd    decimal(10,4) NOT NULL,
        DskSecWrt   decimal(10,4) NOT NULL,
        ProcQueLn   int NOT NULL,
        PRIMARY KEY CLUSTERED (
            [ServerID] ASC
        )
    )
    GO
    CREATE TABLE [Analysis].[InstanceStats] (
        InstanceID  int IDENTITY(1,1) NOT NULL,
        ServerID    int NOT NULL,
        ServerNm    varchar(30) NOT NULL,
        InstanceNm  varchar(30) NOT NULL,
        PerfDate    datetime NOT NULL,
        FwdRecSec   decimal(10,4) NOT NULL,
        PgSpltSec   decimal(10,4) NOT NULL,
        BufCchHit   decimal(10,4) NOT NULL,
        PgLifeExp   int NOT NULL,
        LogGrwths   int NOT NULL,
        BlkProcs    int NOT NULL,
        BatReqSec   decimal(10,4) NOT NULL,
        SQLCompSec  decimal(10,4) NOT NULL,
        SQLRcmpSec  decimal(10,4) NOT NULL,
        PRIMARY KEY CLUSTERED (
            [InstanceID] ASC
        )
    )
    GO
    ALTER TABLE [Analysis].[InstanceStats] WITH CHECK ADD CONSTRAINT [FX_InstanceStats] FOREIGN KEY([ServerID])
    REFERENCES [Analysis].[ServerStats] ([ServerID])
    GO
    ALTER TABLE [Analysis].[InstanceStats] CHECK CONSTRAINT [FX_InstanceStats]
    GO

    I also use stored procedures to load the data, so here's that code (pretty straight CRUD process):

    IF EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[Analysis].[insServerStats]') AND type in (N'P', N'PC'))
        DROP PROCEDURE [Analysis].[insServerStats]
    GO
    CREATE PROCEDURE [Analysis].[insServerStats]
               (@ServerID       int OUTPUT
               ,@ServerNm       varchar(30) = NULL
               ,@PerfDate       datetime = NULL
               ,@PctProc        decimal(10,4) = NULL
               ,@Memory         bigint = NULL
               ,@PgFilUse       decimal(10,4) = NULL
               ,@DskSecRd       decimal(10,4) = NULL
               ,@DskSecWrt      decimal(10,4) = NULL
               ,@ProcQueLn      int = NULL)
    AS
    BEGIN
        DECLARE @ServerOut table( ServerID int);
        INSERT INTO [Analysis].[ServerStats]
               (ServerNm, PerfDate, PctProc, Memory, PgFilUse, DskSecRd, DskSecWrt, ProcQueLn)
        OUTPUT INSERTED.ServerID INTO @ServerOut
        VALUES (@ServerNm, @PerfDate, @PctProc, @Memory, @PgFilUse, @DskSecRd, @DskSecWrt, @ProcQueLn);
        SELECT @ServerID = ServerID FROM @ServerOut
    END
    GO
    IF EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[Analysis].[insInstanceStats]') AND type in (N'P', N'PC'))
        DROP PROCEDURE [Analysis].[insInstanceStats]
    GO
    CREATE PROCEDURE [Analysis].[insInstanceStats]
               (@InstanceID     int OUTPUT
               ,@ServerID       int = NULL
               ,@ServerNm       varchar(30) = NULL
               ,@InstanceNm     varchar(30) = NULL
               ,@PerfDate       datetime = NULL
               ,@FwdRecSec      decimal(10,4) = NULL
               ,@PgSpltSec      decimal(10,4) = NULL
               ,@BufCchHit      decimal(10,4) = NULL
               ,@PgLifeExp      int = NULL
               ,@LogGrwths      int = NULL
               ,@BlkProcs       int = NULL
               ,@BatReqSec      decimal(10,4) = NULL
               ,@SQLCompSec     decimal(10,4) = NULL
               ,@SQLRcmpSec     decimal(10,4) = NULL)
    AS
    BEGIN
        DECLARE @InstanceOut table( InstanceID int);
        INSERT INTO [Analysis].[InstanceStats]
               (ServerID, ServerNm, InstanceNm, PerfDate, FwdRecSec, PgSpltSec, BufCchHit, PgLifeExp, LogGrwths, BlkProcs, BatReqSec, SQLCompSec, SQLRcmpSec)
        OUTPUT INSERTED.InstanceID INTO @InstanceOut
        VALUES (@ServerID, @ServerNm, @InstanceNm, @PerfDate, @FwdRecSec, @PgSpltSec, @BufCchHit, @PgLifeExp, @LogGrwths, @BlkProcs, @BatReqSec, @SQLCompSec, @SQLRcmpSec);
        SELECT @InstanceID = InstanceID FROM @InstanceOut
    END
    GO

    Now the PowerShell. The target server may be a named instance, so I need to split the instance name on the backslash character, like I showed you yesterday. In this case, though, I'll use the name "MSSQLSERVER" for the instance name because that's the name of the default instance everywhere in SQL Server Land except for the performance counters. Once I've got my instance name ($instnm) and machine name ($mach) then I'm ready to pull in the log file from perfmon. The file is a binary file, but PowerShell 2.0 introduced a new cmdlet called Import-Counter which reads that file format nicely. I'll use that cmdlet and assign the results to the $logcntr variable.

    $srv = $inst.Split("\")
    if ($srv.Count -eq 1) {
    	$instnm = 'MSSQLSERVER'
    }
    else {
    	$instnm = $srv[1]
    }
    $mach = $srv[0]
    $logcntr = import-counter $logfil

    We have to iterate through the rows in the counter object, and each row has a property called Timestamp, which is the date and time the counter was taken, and a collection called CounterSamples. This contains all the counters gathered at that time. We'll loop through that looking at the Path property to find the counter and match it with the variable we want to use to store the value for the counter. Once we find it (using wildcards, so we don't have to worry about service names) we assign the CookedValue property to the variable we're using for that counter.

    foreach ($cntr in $logcntr) {
    	$dtm = $cntr.Timestamp
    	$clist = $cntr.CounterSamples
    	foreach ($sample in $clist) {
    		switch -wildcard ($sample.Path) {
    			'*% Processor Time*' { $ppt = $sample.CookedValue }
    			'*Available MBytes*' { $mab = $sample.CookedValue }
    			'*Paging File*' { $pfu = $sample.CookedValue }
    			'*Avg. Disk sec/Read*' { $drs = $sample.CookedValue }
    			'*Avg. Disk sec/Write*' { $dws = $sample.CookedValue }
    			'*Processor Queue Length*' { $pql = $sample.CookedValue }
    			'*Forwarded Records/sec*' { $frs = $sample.CookedValue }
    			'*Page Splits/sec*' { $pss = $sample.CookedValue }
    			'*Buffer cache hit ratio*' { $bch = $sample.CookedValue }
    			'*Page life expectancy*' { $ple = $sample.CookedValue }
    			'*Log Growths*' { $log = $sample.CookedValue }
    			'*Processes blocked*' { $blk = $sample.CookedValue }
    			'*Batch Requests/sec*' { $brs = $sample.CookedValue }
    			'*SQL Compilations/sec*' { $cmp = $sample.CookedValue }
    			'*SQL Re-Compilations/sec*' { $rcm = $sample.CookedValue }
    		}
    	}

    Once we've got the counters correctly loaded into variables we can use the stored procedures created above to load the data into the analysis database. Note that I'm using the Invoke-SQLCMD cmdlet so you'll have to make sure that's been loaded. Also note that I'm returning the identity value from the ServerStats insert into the InstanceStats table so the two can be properly joined in your analysis queries. This design allows me to monitor multiple instances on the same machine and have the counters all properly synchronized when necessary. The @ServerID parameter in the insServerStats procedure is an output parameter, and by assigning the output from the first invoke-sqlcmd cmdlet to a variable called $res (for 'results'), I can get the value by examining the ServerID property of the $res variable.

    	#Send the machine counters to our database
    	$q = "declare @ServerID int; exec [Analysis].[insServerStats]"
    	$q = $q + " @ServerID OUTPUT"
    	$q = $q + ", @ServerNm='" + [string]$mach + "'"
    	$q = $q + ", @PerfDate='" + [string]$dtm + "'"
    	$q = $q + ", @PctProc=" + [string]$ppt
    	$q = $q + ", @Memory=" + [string]$mab
    	$q = $q + ", @PgFilUse=" + [string]$pfu
    	$q = $q + ", @DskSecRd=" + [string]$drs
    	$q = $q + ", @DskSecWrt=" + [string]$dws
    	$q = $q + ", @ProcQueLn=" + [string]$pql
    	$q = $q + "; select @ServerID as ServerID"
    	#write-output $q
    	$res = invoke-sqlcmd -ServerInstance $sqlsrv -Database $destdb -Query $q
    	$SrvID = $res.ServerID
    	#Send the instance counters to the database
    	$q = "declare @InstanceID int; exec [Analysis].[insInstanceStats]"
    	$q = $q + " @InstanceID OUTPUT"
    	$q = $q + ", @ServerID=" + [string]$SrvID
    	$q = $q + ", @ServerNm='" + [string]$mach + "'"
    	$q = $q + ", @InstanceNm='" + [string]$instnm + "'"
    	$q = $q + ", @PerfDate='" + [string]$dtm + "'"
    	$q = $q + ", @FwdRecSec=" + [string]$frs
    	$q = $q + ", @PgSpltSec=" + [string]$pss
    	$q = $q + ", @BufCchHit=" + [string]$bch
    	$q = $q + ", @PgLifeExp=" + [string]$ple
    	$q = $q + ", @LogGrwths=" + [string]$log
    	$q = $q + ", @BlkProcs=" + [string]$blk
    	$q = $q + ", @BatReqSec=" + [string]$brs
    	$q = $q + ", @SQLCompSec=" + [string]$cmp
    	$q = $q + ", @SQLRcmpSec=" + [string]$rcm
    	$q = $q + "; select @InstanceID as InstanceID"
    	#write-output $q
    	$res = invoke-sqlcmd -ServerInstance $sqlsrv -Database $destdb -Query $q
    }

    After running this script for each of your perfmon log files you'll be able to query that data, do aggregate analysis, and more easily find where you're having performance problems, as well as get to know what your baseline is.
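
    For example, here's the kind of aggregate query this makes possible; it's just a sketch against the tables above, averaging a few counters by hour:

    -- Hourly averages per instance, joining the machine-level and instance-level counters
    SELECT  s.ServerNm,
            i.InstanceNm,
            DATEADD(hour, DATEDIFF(hour, 0, s.PerfDate), 0) AS PerfHour,
            AVG(s.PctProc)   AS AvgPctProcessor,
            AVG(s.DskSecRd)  AS AvgDiskSecRead,
            AVG(i.BatReqSec) AS AvgBatchReqSec,
            AVG(CAST(i.PgLifeExp AS bigint)) AS AvgPageLifeExp
      FROM [Analysis].[ServerStats] s
      JOIN [Analysis].[InstanceStats] i ON i.ServerID = s.ServerID
     GROUP BY s.ServerNm, i.InstanceNm, DATEADD(hour, DATEDIFF(hour, 0, s.PerfDate), 0)
     ORDER BY s.ServerNm, i.InstanceNm, PerfHour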


  • Setup Perfmon with PowerShell and Logman

    In a session yesterday I was reminded of the power of a tool on every server, Logman, to manage performance data gathering. It's a pretty amazing tool, and I'm starting to get to know how much more it can do than I've even considered, but here's a simple example of what I've done so far.

    First things first, I'm going to leverage the SMO Server object to get the default backup directory, so I can put the perfmon output into a directory where I know it's going to be backed up. The Server.Settings.BackupDirectory property has the full path for the default backup location, so let's grab that.

    $inst = 'SQLTBWS\INST01' # I get this from a command-line argument, but hard coded it for completeness
    $s = new-object ('Microsoft.SqlServer.Management.Smo.Server') $inst
    $bkup = $s.Settings.BackupDirectory

    Next, I'm going to check to see if a subdirectory called SQLPerf under that backup directory exists, and if not, create it.

    # Create the SQLPerf directory if it doesn't exist
    if (!(Test-Path -path "$bkup\SQLPerf\")) {
    	New-Item "$bkup\SQLPerf\" -type directory | out-null
    }

    Because I get the instance name from a script argument, I don't know whether it's a default or a named instance. This matters when defining the perfmon counters to capture, so we'll "Split" on the backslash character; if there isn't a second element after the split, the instance is a default instance. In that case the $sqlcntr variable is set to "SQLServer", but for a named instance we need to set $sqlcntr to the instance service name, which is "MSSQL$xxxxx", where xxxxx is the instance name after the backslash. I also store the machine name in the $boxnm variable so I can create the output files using that name.

    # Get the instance name to determine if it's a default instance
    $srv = $inst.Split("\")
    if ($srv.Count -eq 1) {
    	$sqlcntr = 'SQLServer'
    	$boxnm = $inst
    }
    else {
    	$sqlcntr = 'MSSQL$' + $srv[1]
    	$boxnm = $srv[0]
    }

    Next, I'll create a 'here-string' containing the configuration properties to send to Logman. Those properties consist of a list of the counters I want to capture. The only thing that's "special" here is the construct "$($sqlcntr)", which allows PowerShell to replace that construct with the counter service name, and allows us to specify valid counters for perfmon. Once we've created the string in the $cfg variable, we can pipe that to the out-file cmdlet, directing it to the machine name config file in the SQLPerf directory under the default backup directory.

    $cfg = @"
    \Processor(_Total)\% Processor Time
    \Memory\Available MBytes
    \Paging File(_Total)\% Usage
    \PhysicalDisk(_Total)\Avg. Disk sec/Read
    \PhysicalDisk(_Total)\Avg. Disk sec/Write
    \System\Processor Queue Length
    \$($sqlcntr):Access Methods\Forwarded Records/sec
    \$($sqlcntr):Access Methods\Page Splits/sec
    \$($sqlcntr):Buffer Manager\Buffer cache hit ratio
    \$($sqlcntr):Buffer Manager\Page life expectancy
    \$($sqlcntr):Databases(_Total)\Log Growths
    \$($sqlcntr):General Statistics\Processes blocked
    \$($sqlcntr):SQL Statistics\Batch Requests/sec
    \$($sqlcntr):SQL Statistics\SQL Compilations/sec
    \$($sqlcntr):SQL Statistics\SQL Re-Compilations/sec
    "@
    $cfg | out-file "$bkup\SQLPerf\$boxnm.config" -encoding ASCII

    Now we've got the config file that logman needs to configure perfmon to our specifications. I call the collection 'SQLPerf', and you can see the command-line options in the TechNet article here. Once the collection is created, we run logman again and issue the start command on the collection.

    logman create counter SQLPerf -f bincirc -max 100 -si 60 --v -o "$bkup\SQLPerf\$boxnm" -cf "$bkup\SQLPerf\$boxnm.config"
    logman start SQLPerf

    Now, we don't want this to run forever, so once a day (usually at about 5am) I stop perfmon, move the counter data to a dated subdirectory, then start up perfmon again.

    # Stop Perfmon Counter log
    logman stop SQLPerf
    # Get the current date/time and create the dated subdirectory
    $dt = get-date -format yyyyMMddHHmmss
    New-Item "$bkup\SQLPerf\$boxnm$dt" -type Directory | out-null
    # Move the counter log files into the dated subdirectory
    Move-Item ("$bkup\SQLPerf\$boxnm*.blg") ("$bkup\SQLPerf\$boxnm$dt")
    # Start Perfmon Counter log
    logman start SQLPerf

    These steps allow me to create a consistent method of gathering performance data where the gathering process can't directly update my analysis database.
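
    In case it helps, here's one hypothetical way to schedule that daily stop/move/start with the Windows task scheduler, assuming those lines are saved as a script; the path, task name and 5am start time below are just placeholders:

    schtasks /create /tn "SQLPerf Rotate" /sc daily /st 05:00 /ru SYSTEM /tr "powershell.exe -ExecutionPolicy Bypass -File C:\Scripts\Rotate-SQLPerf.ps1"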


  • Getting Job History Correctly

    In my last blog post I walked through a way to grab the duration from the EnumHistory method of the JobServer/Job object. As I worked through getting a solution in place for a client I found that the duration calculation was only part of the problem.

    First, when EnumHistory() is invoked, it runs the sp_help_jobhistory stored procedure. You can create a filter object to supply the filters you want, but I didn't do that; there's a sketch of that approach below. Since my script invokes the method for each job on a server, what I found, surprisingly, was that for each job it calls the stored procedure, sends the results into a temporary table, and then selects from that temp table only the rows where the job ID matches the current job. So it's returning the results for ALL jobs, every time, for each job. If you've got a lot of jobs on your server, that's a lot of data being returned unnecessarily.
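
    For reference, the filter-object approach might look something like this (a sketch I haven't put into my script; it assumes $s is the SMO Server object from the earlier post, and the job name is just a placeholder):

    # Build a JobHistoryFilter so the history query is restricted on the server side
    $filter = new-object ('Microsoft.SqlServer.Management.Smo.Agent.JobHistoryFilter')
    $filter.JobName = 'SomeJobName'                # limit to a single job
    $filter.StartRunDate = (get-date).AddDays(-7)  # and to the last week
    $hist = $s.JobServer.EnumJobHistory($filter)
    $hist | select JobName, RunDate, RunDuration, RunStatus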

    Another problem I found is when I want to get all history that occurred since a given point in time. The sp_help_jobhistory procedure has parameters to select start date and start time, so it would seem that populating those values would yield the desired results, right? No, it filters them as separate columns, and what you'll get are the history rows where the job was run on or since the specified date, but only those rows where the start time of the job occurred after the specified start time, on any qualifying date. So, if the start date is February 22, 2012 at 22:00, then a job that ran on February 27, 2012 at 17:00 would not appear in your result set.

    (By the way, there are also parameters to the sp_help_jobhistory procedure for JobID and for JobName. Why the EnumHistory method didn't bother to use these is beyond me.)

    In any event, to get the history that occurred since a point in time, in PowerShell invoke the get-date cmdlet, once for the date and once for the time, casting the time as an integer variable, like this (where $gatherdt contains the date/time for your starting point):

    $strtdt = get-date $gatherdt -format yyyyMMdd
    [int]$strttm = get-date $gatherdt -format HHmmss

    Then you can call the sp_help_jobhistory procedure, like this:

    $q = "exec msdb.dbo.sp_help_jobhistory @mode='FULL',@oldest_first=1,@start_run_date='$strtdt'"
    $jobhist = invoke-sqlcmd -ServerInstance $inst -Database 'msdb' -query $q

    You'll get history for all jobs, but only since 00:00 on the starting date. You can then test the results against the start time by ensuring that on the start date, you only process the rows received after the start time like this:

    #Now extract from the job history the results of the jobs run since the last gather date
    [int]$strtdt = get-date $gatherdt -format yyyyMMdd
    $jobhist | foreach {
    	# Filter out data occurring before the last gather date
    	$ignore = $True
    	$jdt = $_.run_date
    	$jdtt = $_.run_time
    	if (($jdt -gt $strtdt) -or (($jdt -eq $strtdt) -and ($jdtt -gt $strttm))) {
    		$ignore = $False
    	}
    	# (process the row here when $ignore is $False)
    }

    (I recast the start date as an integer because that's how the data is returned from the proc.) For each row I set the $ignore variable to $True, get the incoming date and time, then test to see that the date is greater than the start date, or that the date is equal to the start date and the time is greater than the start time. If it passes those tests I set $ignore to $False, and that allows me to then process the incoming row.

