SQLblog.com - The SQL Server Blog Spot on the Web

Allen White

  • SQLCruise Alaska was Amazing

    You'd think that providing in-depth SQL Server training on a cruise ship would be an excuse for a vacation disguised as a business trip, but you'd be wrong. This past week I traveled with the founders of SQLCruise, Tim Ford and Brent Ozar, along with other top professionals in the SQL Server world - Jeremiah Peschka, Kendra Little, Kevin Kline and Robert Davis.

    The week began with Brent presenting a session on Plan Cache Analysis, which I plan to start using very soon. After Brent, Kevin presented a four-hour leadership training session. He's recorded a set of training videos which will really help IT professionals make the transition to management.

    After spending the day snorkeling in Ketchikan, Alaska with Snorkel Alaska, I presented a short session on SQL Sentry's Plan Explorer and announced Wayne Sheffield as the winner of the Power Suite from SQL Sentry for having the "hairiest execution plan". SQL Sentry paid for my cruise and travel as their representative, and I'm really grateful to them for the opportunity. They have a great suite of products, and they're great people, and that makes me proud to recommend them, as well as to represent them. Following the contest, Jeremiah did an interesting session called "Killer Techniques for Database Performance" where he went into really great detail about the real cost of accessing data in applications.

    Tuesday was mostly a sightseeing day, as we spent the day in Juneau, where I did a tour called "Glacier View Bike and Brew", put together by Cycle Alaska. A van picked us up and took us north to the University of Alaska Southeast on Auke Lake, and we rode our bikes nine miles to Mendenhall Lake, where we got great views of Mendenhall Glacier, then on to the glacier visitor center. Then they drove us into Juneau to a pub where we had a tasting of the different beers from the Alaskan Brewing Company. We reboarded the ship and cruised out to Sawyer Glacier to see that spectacular sight, and on the way participated in a storytelling session in Brent's suite. Some great stories were told, and Darcy Williams won an Amazon Kindle Fire for the best IT story.

    Wednesday was spent in Skagway, where Tim arranged a van tour up through the White Pass which the gold rush prospectors had to traverse to get to the fields near Dawson City in the Yukon Territory. We followed that up with lunch and beverages at the Skagway Brewing Company, or "BrewCo", as they call it locally.

    After three days spent primarily sightseeing, we got back into the serious training on Thursday, when I did a shortened version of my PowerShell pre-con. It's hard to deliver a truly comprehensive "PowerShell for SQL Server" session in four hours, but I think I covered enough for people to get real work done, and I showed some advanced features so they can see what's possible once they get up to speed. After my session, Kendra did a session called "Sports Medicine For High Transaction Databases" where she talked about benchmarking and methods to help your high-volume transactional systems perform under heavy load. The day ended with Robert doing a session on "Recovering Data and Databases" where he talked about how data is stored, how it's backed up, and how to recover from corruption.

    The last day of the cruise started with Brent, this time talking about Scaling SQL with Solid State, where I learned that even putting solid state drives in place may not solve I/O subsystem performance problems, and how to plan for the best performance using them. We followed that up with the Idera Iditarod Challenge, where the cruisers had to answer SQL questions and perform additional challenges, and winners Malathi Mahadevan and Max Gomeniouk each won an Amazon Kindle Fire from Idera. We ended the day visiting Victoria, BC.

    The best part of the cruise, aside from the great training, was getting to know the cruisers and spending real time with them. It's often hard when working as a database professional because we seldom work in a team, and it can be difficult to know what questions to ask when presented with a problem. This training is unique because we spend time together, not just in class, but visiting the sights, and talking about SQL. After a week together we all have new people we can go to with questions and ideas, to see if they're valid or to get problems solved. I'm truly impressed with the work that Tim does in making the cruise a success for everyone, and I hope you'll consider signing up for one of next year's cruises. Check it out at SQLCruise.com

    Allen

  • SQL Rally Presentations

    As I drove to Dallas for this year's SQL Rally conference (yes, I like to drive) I got a call asking if I could step in for another presenter who had to cancel at the last minute.  Life happens, and it's best to be flexible, and I said sure, I can do that. Which presentation would you like me to do? (I'd submitted a few presentations, so it wasn't a problem.)

    So yesterday I presented "Gathering Performance Metrics With PowerShell" at 8:45AM, and my newest presentation, "Manage SQL Server 2012 on Windows Server Core with PowerShell", at 1PM at the conference here in Dallas. The room was packed for both sessions, there were great questions in each, and I truly appreciate every one of you who attended.

    I've attached a zip file containing the slide decks from both presentations along with the demo scripts. I hope you get as much out of using PowerShell with SQL Server as I have. Thanks!

     Allen

  • Script and Migrate Agent Jobs between Servers using PowerShell

    I'm nearing the end of a SQL Server upgrade project, migrating from SQL Server 2000 to SQL Server 2008 R2. The company uses SQL Server Agent to run about 150 jobs which execute DTS packages to do the work of moving data around. Now, there are any number of ways to move these jobs from one server to the other, but I wanted to be able to exclude certain jobs, make some minor changes before loading them onto the new server, that kind of thing. PowerShell makes that relatively easy to do.

    First, most of the jobs were owned by individual logins, many of them belonging to people no longer with the company. This happens over time, but it should be cleaned up. To quickly change all the jobs to be owned by 'sa', it's just a few lines of code:

    # Connect to the instance using SMO
    $s = new-object ('Microsoft.SqlServer.Management.Smo.Server') $inst
    
    # Iterate through the jobs to reset the job owner
    foreach ($job in $s.JobServer.Jobs) {
    	if ($job.Owner -ne 'sa') {
    		$job.OwnerLoginName = 'sa'
    		$job.Alter()
    		}
    	}

    My scripts usually pass in $inst as the SQL Server instance name as a command-line argument, so I can use the script on any server. Once this script was done, all the jobs were owned by sa, and I don't have to worry about enabling old logins on the new server. The next step is to script out the jobs into .sql files, one for each job. In this case I pass in a base directory path as a second command-line argument. Then I use PowerShell to create the Jobs directory under that (note that if you're using a named instance, I'd recommend using the encode-sqlname cmdlet to turn the $inst variable into an encoded name to prevent the backslash from creating problems - see the sketch after the next code block):

    # Create the Jobs directory under the base path if it doesn't exist
    if (!(Test-Path -path "$base\$inst\Jobs\"))
    	{
    	New-Item "$base\$inst\Jobs\" -type directory | out-null
    	}
    $homedir = "$base\$inst\Jobs\"
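
    If you do script a named instance, here's a minimal sketch of the encode-sqlname approach mentioned above. It assumes the sqlps snap-in or module (which provides the cmdlet) is loaded, and simply swaps the encoded name into the path:

    # Encode the instance name so the backslash in a named instance
    # (e.g. SERVER\INST01) gets percent-encoded instead of splitting the path
    $instenc = encode-sqlname $inst
    $homedir = "$base\$instenc\Jobs\"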
    

    The next step is to set the properties for the script files:

    # Instantiate the Scripter object and set the base properties
    $scrp = new-object ('Microsoft.SqlServer.Management.Smo.Scripter') ($s)
    $scrp.Options.ScriptDrops = $False
    $scrp.Options.WithDependencies = $False
    $scrp.Options.IncludeHeaders = $True
    $scrp.Options.AppendToFile = $False
    $scrp.Options.ToFileOnly = $True
    $scrp.Options.ClusteredIndexes = $True
    $scrp.Options.DriAll = $True
    $scrp.Options.Indexes = $True
    $scrp.Options.Triggers = $True
    

    Then we can cycle through the jobs and create scripts for each job on the server.

    # Create the script file for each job
    foreach ($job in $s.JobServer.Jobs) {
    	$jobname = $job.Name
    	$scrp.Options.FileName = "$homedir\$jobname.sql"
    	$scrp.Script($job)
    	}
    

    There are some jobs on the original server that won't move - things like maintenance plan jobs, which I'm replacing with more efficient methods, so having individual script files allows me to simply delete the script files for the jobs not moving to the new server. Once I've done that I have another script which will load the script file into a PowerShell variable, and using invoke-sqlcmd, create the job on the new server. Now, each script file ends with a 'GO' statement, and invoke-sqlcmd doesn't like that, so I want to strip it out, which is easy enough to do with the Replace method. The problem is that it also contains a GOTO command to bypass the error notification. I use the Replace() method to get around that by first replacing GOTO with OTOG, then replacing GO with an empty string, then replacing OTOG again with GOTO.

    The other 'tricky' thing I had to do was to convert the script file into a single PowerShell string. Get-Content returns the file as an array of strings, one per line, so I place that result inside a here-string, which flattens it into a single System.String that I can then supply as the query for invoke-sqlcmd.

    # Get all the files with a .sql extension in the directory and cycle through them
    $files = get-childitem $basedir | where {$_.Name -like '*.sql'}
    foreach ($file in $files) {
    	$fname = $file.Name
    	$sql = get-content "$basedir$fname"
    	$qry = @"
    $sql
    "@
    	$qry = $qry.Replace('GOTO','OTOG')
    	$qry = $qry.Replace('GO','')
    	$qry = $qry.Replace('OTOG','GOTO')
    	invoke-sqlcmd -ServerInstance $inst -Database master -Query $qry
    	}
    

    Now the jobs are loaded on the new server, but of course the problem is that they're ACTIVE. Until we go live we don't want them running, so we need to disable them en masse. That's pretty simple as well.

    $jobs = $s.JobServer.Jobs 
    foreach ($job in $jobs) {$job.IsEnabled = $False; $job.Alter()}
    

    And now the jobs are all in place, and ready to be enabled once the green light is given on the migration.
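
    And when the green light does come, re-enabling them is the same one-liner with the flag flipped:

    foreach ($job in $s.JobServer.Jobs) {$job.IsEnabled = $True; $job.Alter()}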

    Allen

  • System Inventory and Performance Gathering SQL Connections Session

    Yesterday I delivered a session on maintaining a system inventory and gathering performance data with PowerShell. The session went well, there were great questions, and I'd like to thank everyone who attended. I've attached the slides and demo scripts from the session for your use.

    Allen

  • PowerShell Precon session at SQL Connections

    Yesterday I had the privilege of presenting the full day training session SPR304-Automate and Manage SQL Server with PowerShell at the SQL Connections conference in Las Vegas.  The session went very well (at least from my perspective) and I think the attendees enjoyed it as well.

    Just the day before the session I got excited about some features of PowerShell I hadn't played with much and decided to add a discussion of them to the presentation, so the materials the conference distributed to attendees don't include that content. (I also had a minor amount of demo-fail, so I've corrected those scripts as well.)

    I've attached the presentation deck and all the demo scripts to this post, but please, don't ask me for the password. If you were in the room yesterday you know what it is, and as the beer commercial says, "this one's for you."

    Thanks for attending!

    Allen

  • Speaking About SQL Server

    There's a lot of excitement in the SQL Server world right now, with the RTM (Release to Manufacturing) release of SQL Server 2012, and the availability of SQL Server Data Tools (SSDT). My personal speaking schedule has exploded as well. Just this past Saturday I presented a session called Gather SQL Server Performance Data with PowerShell.

    There are a lot of events coming up, and I hope to see you at one or more of them. Here's a list of what's scheduled so far:

    First, I'll be presenting a session called Automate Policy-Based Management Using PowerShell at the 24 Hours of PASS on March 21.

    The Policy-Based Management feature in SQL Server 2008 provides a great way to ensure your systems are configured consistently and correctly, but it can be tedious to implement on each server in your environment. PowerShell scripts let you automate the implementation of your policies so that you can focus on more important problems. This session walks you through how PBM works, how to define your policies in PowerShell, and how to set up Agent jobs that evaluate those policies regularly and let you know when you need to take action.

     

    The following week, I'm really excited to be one of the featured presenters at the SQL Connections conference in Las Vegas. This event is also one of the official launch events for SQL Server 2012. At this event I'll be presenting an all day workshop called SPR304: Automate and Manage SQL Server with PowerShell. With PowerShell becoming more and more a key tool in managing all server systems, this workshop will get you up to speed quickly, ready to take control of SQL Server.

    This soup-to-nuts all day workshop will first introduce you to PowerShell, after which you’ll learn the basic SMO object model, how to manipulate data with PowerShell and how to use SMO to manage objects. We’ll then move on to creating Policy-Based Management policies, work with the Central Management Server, manage your system inventory and gather performance data with PowerShell. We’ll wrap up with a look at the new PowerShell cmdlets introduced for SQL Server 2012 and how you can use PowerShell to manage SQL Server 2012 in server environments including the new Always On technology and Windows Server Core. After this one day, you’ll be ready to go to work and able to use PowerShell to make you truly effective.
    In addition, I'll be doing three regular conference sessions.

    SQL226: Scalable Application Design with Service Broker. How your business applications interact directly with the corporate database can have a dramatic impact on how successful you are, and one of the biggest issues is performance with large databases. By designing the application to use asynchronous messaging for non-critical updates you can focus performance concerns where timeliness is critical. SQL Server Service Broker handles the asynchronous requests quickly and efficiently, and minimizes the conflicts with your critical updates. This session will walk you through the steps to get Service Broker up and running and providing consistent delivery of your business data.

    SQL325: Maintain SQL Server System and Performance Data with PowerShell. Maintaining a solid set of information about our servers and their performance is critical when issues arise, and it often helps us see a problem before it occurs. Building a baseline of performance metrics allows us to know when something is wrong and helps us track it down and fix the problem. This session will walk you through a series of PowerShell scripts you can schedule which will capture the most important data, and a set of reports to show you how to use that data to keep your server running smoothly.

    SQL427: Manage SQL Server 2012 on Windows Server Core with PowerShell. Windows Server 2008 introduced Server Core, the operating system without the Graphical User Interface. SQL Server 2012 is the first version of SQL Server to support Server Core. To gain the maximum efficiency from your servers you want to make use of Server Core, and with PowerShell you can effectively manage SQL Server in that environment. This session will demonstrate how to manage SQL Server 2012 on Windows Server Core and provide scripts to help you perform the common tasks you need to perform on that platform.

     

    On April 14, I'll be at SQL Saturday 111 in Atlanta, presenting the Manage SQL Server 2012 on Windows Server Core with PowerShell session from SQL Connections.

    The Akron-Canton Chapter of AITP invited me to speak at their monthly meeting on April 24, where I'll be presenting Scalable Application Design with Service Broker.

    I was excited to be selected for this year's PASS SQL Rally conference in Dallas May 10-11, where I'll be presenting the Manage SQL Server 2012 on Windows Server Core with PowerShell session as well.

    Finally, and I am really excited about this opportunity, SQL Sentry asked me to represent them on this May's SQL Cruise to Alaska. The SQL Cruise is a great opportunity for professionals to get focused training on SQL Server from some of the top professionals in the SQL Server industry. If you look at the cost per training day, including hotel and meals, and compare it to the cruise, you'll find that the cruise works out as a better price than traditional training, and you'll get to see some of the most beautiful landscapes in the country! I'll be presenting a four hour version of my PowerShell workshop, and I'd love to see you there.

    Well, that's what's on the schedule for the spring. The summer will be a bit less busy, but I plan to get to a few events over the summer as well. I hope to see you at some of these great training events!

    Allen

  • Set and Verify the Retention Value for Change Data Capture

    Last summer I set up Change Data Capture for a client to track changes to their application database and apply those changes to their data warehouse. The client had some issues a short while back and felt they needed to increase the retention period from the default of 3 days to 5 days. I ran this command in the CDC-enabled database to make that change:

    EXECUTE sys.sp_cdc_change_job @job_type = 'cleanup', @retention = 7200
    

    The value 7200 represents the number of minutes in a period of 5 days. All was well, but they recently asked how they can verify that the change was made. Well, it turns out the values are stored in the msdb database, in a table called dbo.cdc_jobs. I put together this quick query to allow them to test any database with CDC enabled, to find the retention value:

    SELECT [retention]
      FROM [msdb].[dbo].[cdc_jobs]
      WHERE [database_id] = DB_ID()
      AND [job_type] = 'cleanup'

    It's simple, but without knowing where to look, it can be confusing.
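
    If it's easier to think in days than in minutes, the same lookup can do the conversion for you (there are 1440 minutes in a day); this variation is just an illustration:

    SELECT [retention] AS retention_minutes,
           [retention] / 1440.0 AS retention_days
      FROM [msdb].[dbo].[cdc_jobs]
      WHERE [database_id] = DB_ID()
      AND [job_type] = 'cleanup'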

    Allen

  • Load Perfmon Log Data into SQL Server with PowerShell

    In yesterday's post I showed you how I set up performance data gathering in an environment where I can't directly update my analysis database. Today I'll show you how I load that data into SQL Server for the analysis.

    First, I create tables to contain the performance counters in my analysis database. Here's what they look like:

    CREATE TABLE [Analysis].[ServerStats] (
        ServerID    int IDENTITY(1,1) NOT NULL,
        ServerNm    varchar(30) NOT NULL,
        PerfDate    datetime NOT NULL,
        PctProc     decimal(10,4) NOT NULL,
        Memory      bigint NOT NULL,
        PgFilUse    decimal(10,4) NOT NULL,
        DskSecRd    decimal(10,4) NOT NULL,
        DskSecWrt   decimal(10,4) NOT NULL,
        ProcQueLn   int NOT NULL,
        CONSTRAINT [PK_ServerStats] PRIMARY KEY CLUSTERED 
        (
            [ServerID] ASC
        )
    )
    GO
    
    CREATE TABLE [Analysis].[InstanceStats] (
        InstanceID  int IDENTITY(1,1) NOT NULL,
        ServerID    int NOT NULL,
        ServerNm    varchar(30) NOT NULL,
        InstanceNm  varchar(30) NOT NULL,
        PerfDate    datetime NOT NULL,
        FwdRecSec   decimal(10,4) NOT NULL,
        PgSpltSec   decimal(10,4) NOT NULL,
        BufCchHit   decimal(10,4) NOT NULL,
        PgLifeExp   int NOT NULL,
        LogGrwths   int NOT NULL,
        BlkProcs    int NOT NULL,
        BatReqSec   decimal(10,4) NOT NULL,
        SQLCompSec  decimal(10,4) NOT NULL,
        SQLRcmpSec  decimal(10,4) NOT NULL,
        CONSTRAINT [PK_InstanceStats] PRIMARY KEY CLUSTERED 
        (
            [InstanceID] ASC
        )
    )
    GO
    ALTER TABLE [Analysis].[InstanceStats] WITH CHECK ADD  CONSTRAINT [FX_InstanceStats] FOREIGN KEY([ServerID])
    REFERENCES [Analysis].[ServerStats] ([ServerID])
    GO
    
    ALTER TABLE [Analysis].[InstanceStats] CHECK CONSTRAINT [FX_InstanceStats] 
    GO
    

    I also use stored procedures to load the data, so here's that code (pretty straight CRUD process):

    IF EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[Analysis].[insServerStats]') AND type in (N'P', N'PC'))
        DROP PROCEDURE [Analysis].[insServerStats]
    GO
    CREATE PROCEDURE [Analysis].[insServerStats]
               (@ServerID       int OUTPUT
               ,@ServerNm       varchar(30) = NULL
               ,@PerfDate       datetime = NULL
               ,@PctProc        decimal(10,4) = NULL
               ,@Memory     bigint = NULL
               ,@PgFilUse       decimal(10,4) = NULL
               ,@DskSecRd       decimal(10,4) = NULL
               ,@DskSecWrt      decimal(10,4) = NULL
               ,@ProcQueLn      int = NULL)
    AS
        SET NOCOUNT ON
        
        DECLARE @ServerOut table( ServerID int);
    
        INSERT INTO [Analysis].[ServerStats]
               ([ServerNm]
               ,[PerfDate]
               ,[PctProc]
               ,[Memory]
               ,[PgFilUse]
               ,[DskSecRd]
               ,[DskSecWrt]
               ,[ProcQueLn])
        OUTPUT INSERTED.ServerID INTO @ServerOut
            VALUES
               (@ServerNm
               ,@PerfDate
               ,@PctProc
               ,@Memory
               ,@PgFilUse
               ,@DskSecRd
               ,@DskSecWrt
               ,@ProcQueLn)
    
        SELECT @ServerID = ServerID FROM @ServerOut
        
        RETURN
    
    GO
    
    IF EXISTS (SELECT * FROM sys.objects WHERE object_id = OBJECT_ID(N'[Analysis].[insInstanceStats]') AND type in (N'P', N'PC'))
        DROP PROCEDURE [Analysis].[insInstanceStats]
    GO
    CREATE PROCEDURE [Analysis].[insInstanceStats]
               (@InstanceID     int OUTPUT
               ,@ServerID       int = NULL
               ,@ServerNm       varchar(30) = NULL
               ,@InstanceNm     varchar(30) = NULL
               ,@PerfDate       datetime = NULL
               ,@FwdRecSec      decimal(10,4) = NULL
               ,@PgSpltSec      decimal(10,4) = NULL
               ,@BufCchHit      decimal(10,4) = NULL
               ,@PgLifeExp      int = NULL
               ,@LogGrwths      int = NULL
               ,@BlkProcs       int = NULL
               ,@BatReqSec      decimal(10,4) = NULL
               ,@SQLCompSec     decimal(10,4) = NULL
               ,@SQLRcmpSec     decimal(10,4) = NULL)
    AS
        SET NOCOUNT ON
        
        DECLARE @InstanceOut table( InstanceID int);
    
        INSERT INTO [Analysis].[InstanceStats]
               ([ServerID]
               ,[ServerNm]
               ,[InstanceNm]
               ,[PerfDate]
               ,[FwdRecSec]
               ,[PgSpltSec]
               ,[BufCchHit]
               ,[PgLifeExp]
               ,[LogGrwths]
               ,[BlkProcs]
               ,[BatReqSec]
               ,[SQLCompSec]
               ,[SQLRcmpSec])
        OUTPUT INSERTED.InstanceID INTO @InstanceOut
        VALUES
               (@ServerID
               ,@ServerNm
               ,@InstanceNm
               ,@PerfDate
               ,@FwdRecSec
               ,@PgSpltSec
               ,@BufCchHit
               ,@PgLifeExp
               ,@LogGrwths
               ,@BlkProcs
               ,@BatReqSec
               ,@SQLCompSec
               ,@SQLRcmpSec)
    
        SELECT @InstanceID = InstanceID FROM @InstanceOut
        
        RETURN
    
    GO
    

    Now the PowerShell. The target server may be a named instance, so I need to split the instance name on the backslash character, like I showed you yesterday. In this case, though, I'll use the name "MSSQLSERVER" for the instance name because that's the name of the default instance everywhere in SQL Server Land except for the performance counters. Once I've got my instance name ($instnm) and machine name ($mach) then I'm ready to pull in the log file from perfmon. The file is a binary file, but PowerShell 2.0 introduced a new cmdlet called Import-Counter which reads that file format nicely. I'll use that cmdlet and assign the results to the $logcntr variable.

    $srv = $inst.Split("\")
    if ($srv.Count -eq 1) {
    	$instnm = 'MSSQLSERVER'
    	}
    	else {
    	$instnm = $srv[1]
    	}
    $mach = $srv[0]
    
    $logcntr = import-counter $logfil
    

    We have to iterate through the rows in the counter object, and each row has a property called Timestamp, which is the date and time the counter was taken, and a collection called CounterSamples. This contains all the counters gathered at that time. We'll loop through that looking at the Path property to find the counter and match it with the variable we want to use to store the value for the counter. Once we find it (using wildcards, so we don't have to worry about service names) we assign the CookedValue property to the variable we're using for that counter.

    foreach ($cntr in $logcntr) {
    	$dtm = $cntr.Timestamp
    	$clist = $cntr.CounterSamples
    	foreach ($sample in $clist) {
    		switch -wildcard ($sample.Path) {
    			'*% Processor Time*' { $ppt = $sample.CookedValue }
    			'*Available MBytes*' { $mab = $sample.CookedValue }
    			'*Paging File*' { $pfu = $sample.CookedValue }
    			'*Avg. Disk sec/Read*' { $drs = $sample.CookedValue }
    			'*Avg. Disk sec/Write*' { $dws = $sample.CookedValue }
    			'*Processor Queue Length*' { $pql = $sample.CookedValue }
    			'*Forwarded Records/sec*' { $frs = $sample.CookedValue }
    			'*Page Splits/sec*' { $pss = $sample.CookedValue }
    			'*Buffer cache hit ratio*' { $bch = $sample.CookedValue }
    			'*Page life expectancy*' { $ple = $sample.CookedValue }
    			'*Log Growths*' { $log = $sample.CookedValue }
    			'*Processes blocked*' { $blk = $sample.CookedValue }
    			'*Batch Requests/sec*' { $brs = $sample.CookedValue }
    			'*SQL Compilations/sec*' { $cmp = $sample.CookedValue }
    			'*SQL Re-Compilations/sec*' { $rcm = $sample.CookedValue }
    			 }
    		}
    

    Once we've got the counters correctly loaded into variables we can use the stored procedures created above to load the data into the analysis database. Note that I'm using the Invoke-SQLCMD cmdlet so you'll have to make sure that's been loaded. Also note that I'm returning the identity value from the ServerStats insert into the InstanceStats table so the two can be properly joined in your analysis queries. This design allows me to monitor multiple instances on the same machine and have the counters all properly synchronized when necessary. The @ServerID parameter in the insServerStats procedure is an output parameter, and by assigning the output from the first invoke-sqlcmd cmdlet to a variable called $res (for 'results'), I can get the value by examining the ServerID property of the $res variable.

    	#Send the machine counters to our database
    	$q = "declare @ServerID int; exec [Analysis].[insServerStats]"
    	$q = $q + " @ServerID OUTPUT"
    	$q = $q + ", @ServerNm='" + [string]$mach + "'"
    	$q = $q + ", @PerfDate='" + [string]$dtm + "'"
    	$q = $q + ", @PctProc=" + [string]$ppt
    	$q = $q + ", @Memory=" + [string]$mab
    	$q = $q + ", @PgFilUse=" + [string]$pfu
    	$q = $q + ", @DskSecRd=" + [string]$drs
    	$q = $q + ", @DskSecWrt=" + [string]$dws
    	$q = $q + ", @ProcQueLn=" + [string]$pql
    	$q = $q + "; select @ServerID as ServerID"
    	#write-output $q
    	$res = invoke-sqlcmd -ServerInstance $sqlsrv -Database $destdb -Query $q
    	$SrvID = $res.ServerID
    			
    	#Send the instance counters to the database
    	$q = "declare @InstanceID int; exec [Analysis].[insInstanceStats]"
    	$q = $q + " @InstanceID OUTPUT"
    	$q = $q + ", @ServerID=" + [string]$SrvID
    	$q = $q + ", @ServerNm='" + [string]$mach + "'"
    	$q = $q + ", @InstanceNm='" + [string]$instnm + "'"
    	$q = $q + ", @PerfDate='" + [string]$dtm + "'"
    	$q = $q + ", @FwdRecSec=" + [string]$frs
    	$q = $q + ", @PgSpltSec=" + [string]$pss
    	$q = $q + ", @BufCchHit=" + [string]$bch
    	$q = $q + ", @PgLifeExp=" + [string]$ple
    	$q = $q + ", @LogGrwths=" + [string]$log
    	$q = $q + ", @BlkProcs=" + [string]$blk
    	$q = $q + ", @BatReqSec=" + [string]$brs
    	$q = $q + ", @SQLCompSec=" + [string]$cmp
    	$q = $q + ", @SQLRcmpSec=" + [string]$rcm
    	$q = $q + "; select @InstanceID as InstanceID"
    	#write-output $q
    	$res = invoke-sqlcmd -ServerInstance $sqlsrv -Database $destdb -Query $q
    	}
    

    After running this script for each of your perfmon log files you'll be able to query that data, do aggregate analysis, and more easily find where you're having performance problems, as well as get to know what your baseline is.
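
    As a starting point for that analysis, here's an example query (an illustration, not part of the original scripts) that averages a few of the counters per server, instance and day using the tables defined above:

    -- Daily baseline averages per server and instance
    SELECT  s.ServerNm,
            i.InstanceNm,
            CONVERT(date, i.PerfDate)        AS PerfDay,
            AVG(s.PctProc)                   AS AvgPctProcessor,
            AVG(i.BatReqSec)                 AS AvgBatchRequestsSec,
            AVG(CAST(i.PgLifeExp AS bigint)) AS AvgPageLifeExpectancy
      FROM [Analysis].[ServerStats] s
      JOIN [Analysis].[InstanceStats] i ON i.ServerID = s.ServerID
     GROUP BY s.ServerNm, i.InstanceNm, CONVERT(date, i.PerfDate)
     ORDER BY s.ServerNm, i.InstanceNm, PerfDay;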

    Allen

  • Setup Perfmon with PowerShell and Logman

    In a session yesterday I was reminded of the power of Logman, a tool available on every server, for managing performance data gathering. It's a pretty amazing tool, and I'm only starting to appreciate how much more it can do than I've considered so far, but here's a simple example of what I've done with it.

    First things first, I'm going to leverage the SMO Server object to get the default backup directory, so I can put the perfmon output into a directory where I know it's going to be backed up. The Server.Settings.BackupDirectory property has the full path for the default backup location, so let's grab that.

    $inst = 'SQLTBWS\INST01' # I get this from a command-line argument, but hard coded it for completeness
    $s = new-object ('Microsoft.SqlServer.Management.Smo.Server') $inst
    $bkup = $s.Settings.BackupDirectory
    

    Next, I'm going to check to see if a subdirectory called SQLPerf under that backup directory exists, and if not, create it.

    # Create the evaluation results directory if it doesn't exist
    if (!(Test-Path -path "$bkup\SQLPerf\"))
    	{
    	New-Item "$bkup\SQLPerf\" -type directory | out-null
    	}
    

    Because I get the instance name from a script argument, I don't know whether it's a default or a named instance. This matters when defining the perfmon counters to capture, so we "Split" on the backslash character; if there isn't a second element after the split, the instance is a default instance. In that case the $sqlcntr variable is set to "SQLServer", but for a named instance we need to set $sqlcntr to the instance service name, which is "MSSQL$xxxxx", where xxxxx is the instance name after the backslash. I also store the machine name in the $boxnm variable so I can create the output files using that name.

    # Get the instance name to determine if it's a default instance
    $srv = $inst.Split("\")
    if ($srv.Count -eq 1) {
    	$sqlcntr = 'SQLServer'
    	$boxnm = $inst
    	}
    	else {
    	$sqlcntr = 'MSSQL$' + $srv[1]
    	$boxnm = $srv[0]
    	}
    

    Next, I'll create a 'here-string' containing the configuration properties to send to Logman. Those properties consist of a list of the counters I want to capture. The only thing that's "special" here is the construct "$($sqlcntr)", which tells PowerShell to expand that expression to the counter service name inside the string, so we end up with valid perfmon counter paths. Once we've created the string in the $cfg variable, we pipe it to the out-file cmdlet, directing it to the machine-name config file in the SQLPerf directory under the default backup directory.

    $cfg = @"
    \Processor(_Total)\% Processor Time
    \Memory\Available MBytes
    \Paging File(_Total)\% Usage
    \PhysicalDisk(_Total)\Avg. Disk sec/Read
    \PhysicalDisk(_Total)\Avg. Disk sec/Write
    \System\Processor Queue Length
    \$($sqlcntr):Access Methods\Forwarded Records/sec
    \$($sqlcntr):Access Methods\Page Splits/sec
    \$($sqlcntr):Buffer Manager\Buffer cache hit ratio
    \$($sqlcntr):Buffer Manager\Page life expectancy
    \$($sqlcntr):Databases(_Total)\Log Growths
    \$($sqlcntr):General Statistics\Processes blocked
    \$($sqlcntr):SQL Statistics\Batch Requests/sec
    \$($sqlcntr):SQL Statistics\SQL Compilations/sec
    \$($sqlcntr):SQL Statistics\SQL Re-Compilations/sec
    "@
    $cfg | out-file "$bkup\SQLPerf\$boxnm.config" -encoding ASCII
    

    Now we've got the config file that logman needs to configure perfmon to our specifications. I call the collection name 'SQLPerf', and you can see the command line options in the TechNet article here. Once the collection is created we run logman again and issue the start command on the collection.

    logman create counter SQLPerf -f bincirc -max 100 -si 60 --v -o "$bkup\SQLPerf\$boxnm" -cf "$bkup\SQLPerf\$boxnm.config"
    logman start SQLPerf
    

    Now, we don't want this to run forever, so once a day (usually at about 5am) I stop perfmon, move the counter data to a dated subdirectory, then start up perfmon again.

    # Stop Perfmon Counter log
    logman stop SQLPerf
    
    # Get the current date/time and create the dated subdirectory
    $dt = get-date -format yyyyMMddHHmmss
    New-Item "$bkup\SQLPerf\$boxnm$dt" -type Directory | out-null
    
    # Move trace files into a dated subdirectory
    Move-Item ("$bkup\SQLPerf\$boxnm*.blg") ("$bkup\SQLPerf\$boxnm$dt")
    
    # Start Perfmon Counter log
    logman start SQLPerf
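
    To make that daily rotation hands-off, something like the following could register it as a 5am scheduled task. This is only a sketch - it assumes you've saved the stop/move/start commands above into a script file (the path and task name here are made up):

    schtasks /create /tn "SQLPerf Rotation" /sc daily /st 05:00 /ru SYSTEM /tr "powershell.exe -ExecutionPolicy Bypass -File C:\Scripts\Rotate-SQLPerf.ps1"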
    

    These steps allow me to create a consistent method of gathering performance data where the gathering process can't directly update my analysis database.

    Allen

  • Getting Job History Correctly

    In my last blog post I walked through a way to grab the duration from the EnumHistory method of the JobServer/Job object. As I worked through getting a solution in place for a client I found that the duration calculation was only part of the problem.

    First, when EnumHistory() is invoked, it runs the sp_help_jobhistory stored procedure. You can create a filter object which will supply the filters you want to the results, but I didn't do that (a minimal example of the filter-object approach appears below). Given that it's invoked in my script for each job on a server, what I found, surprisingly, was that for each job it calls the stored procedure, sends the results into a temporary table, and then selects from that temp table only the rows where the job ID is that of the current job. So it's returning the results for ALL jobs, every time, for each job. If you've got a lot of jobs on your server, that's a lot of data being returned unnecessarily.
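
    For reference, here's a minimal sketch of that filter-object approach (again, not what I did). It assumes SMO's JobHistoryFilter class and the EnumJobHistory overload that accepts it:

    # Restrict the history lookup to a single job instead of pulling everything
    $filter = new-object ('Microsoft.SqlServer.Management.Smo.Agent.JobHistoryFilter')
    $filter.JobID = $job.JobID
    $hist = $s.JobServer.EnumJobHistory($filter)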

    Another problem I found is when I want to get all history that occurred since a given point in time. The sp_help_jobhistory procedure has parameters to select start date and start time, so it would seem that populating those values would yield the desired results, right? No, it filters them as separate columns, and what you'll get are the history rows where the job was run on or since the specified date, but only those rows where the start time of the job occurred after the specified start time, on any qualifying date. So, if the start date is February 22, 2012 at 22:00, then a job that ran on February 27, 2012 at 17:00 would not appear in your result set.

    (By the way, there are also parameters to the sp_help_jobhistory procedure for JobID and for JobName. Why the EnumHistory method doesn't bother to use them is beyond me.)

    In any event, to get the history that occurred since a point in time, in PowerShell invoke the get-date cmdlet, once for the date and once for the time, casting the time as an integer variable, like this (where $gatherdt contains the date/time for your starting point):

    $strtdt = get-date $gatherdt -format yyyyMMdd
    [int]$strttm = get-date $gatherdt -format HHmmss
    

    Then you can call the sp_help_jobhistory procedure, like this:

    $q = "exec msdb.dbo.sp_help_jobhistory @mode='FULL',@oldest_first=1,@start_run_date='$strtdt'"
    $jobhist = invoke-sqlcmd -ServerInstance $inst -Database 'msdb' -query $q
    

    You'll get history for all jobs, but only since 00:00 on the starting date. You can then test the results against the start time by ensuring that on the start date, you only process the rows received after the start time like this:

    #Now extract from the job history the results of the jobs run since the last gather date
    [int]$strtdt = get-date $gatherdt -format yyyyMMdd
    $jobhist | foreach {
    	# Filter out data occurring before the last gather date/time
    	$ignore = $True
    	$jdt = $_.run_date
    	$jdtt = $_.run_time
    	if (($jdt -gt $strtdt) -or (($jdt -eq $strtdt) -and ($jdtt -gt $strttm))) {
    		$ignore = $False
    		}
    	if (!$ignore) {
    		# ... process the incoming row here ...
    		}
    	}
    

    (I recast the start date as an integer because that's how the data is returned from the proc.) For each row I set the $ignore variable to $True, get the incoming date and time, then test whether the date is greater than the start date, or the date is equal to the start date and the time is greater than the start time. If the row passes that test I set $ignore to $False, which allows me to then process the incoming row.

    Allen

  • Handle Duration Results from EnumHistory in PowerShell

    Lately I've been very busy with client work, and one of my projects is to capture the history for SQL Agent jobs across all their servers so they can evaluate when to schedule jobs. (Yes, I know SQL Sentry has an excellent product for this, but right now there are budget constraints.) Anyway, I can do this in PowerShell with SMO.

    Once you connect to a server in SMO, the JobServer object has a collection called Jobs. For each job in that collection, you can call the EnumHistory method and it will return both job and job step results, including duration.

    The TechNet page describing the DataTable returned by EnumHistory indicates that the Duration property is the duration of the job or step in seconds. Unfortunately, that's blatantly incorrect.

    Let's say a long-running job returns a duration value of 93413. Converting 93413 seconds to days, hours, minutes and seconds gives 1.01:56:53 (one day, 1 hour, 56 minutes and 53 seconds). But if you look in the View History window for the job in SQL Server Agent you'll see that the job ran in 9 hours, 34 minutes and 13 seconds. Yes, the duration of 93413 translates to 9:34:13 - the value is the run time encoded as HHMMSS, not a count of seconds.

    I'm not sure who made the decision to record the time this way, because parsing it back out so that it's useful is problematic. I need to get this done, so I wrote this PowerShell function for just that purpose.

    # Function to turn the Duration value returned by EnumHistory() into a usable time string
    Function Convert-Duration {
    param ([int]$sec)

    #Now break it down into its pieces
    if ($sec -gt 9999) {$hh = [int][Math]::Truncate($sec/10000); $sec = $sec - ($hh*10000)}
    else {$hh = 0}
    if ($sec -gt 99) {$mm = [int][Math]::Truncate($sec/100); $sec = $sec - ($mm*100)}
    else {$mm = 0}

    #Format and return the time value
    $dur = ("{0:D2}" -f $hh) + ':' + "{0:D2}" -f $mm + ':' + "{0:D2}" -f $sec
    $dur
    }

    The function takes an integer value as a parameter, and returns a string formatted to be useful as a time string, so this code:

    $sec = 93413
    $dur = Convert-Duration $sec
    $dur

    returns a result of 09:34:13, which I then use to pass to a stored procedure parameter defined as a datatype of TIME.

    I had to add the [Math]::Truncate call on the division because casting the result to [int] makes PowerShell round to the nearest integer, and I needed it to simply truncate the fraction.
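
    A quick illustration of the difference, using a duration value of 95959 (9:59:59):

    [int](95959/10000)               # returns 10 - the cast rounds 9.5959 up
    [Math]::Truncate(95959/10000)    # returns 9  - truncation keeps the correct hours digit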

     

    I hope that helps you, because with that function my job is much easier.

    Allen

  • [Speaking] Manage SQL Server 2012 on Windows Server Core with PowerShell

    This evening I'll be presenting a brand new session on installing and managing SQL Server 2012 with PowerShell. Here's the abstract:

    Windows Server 2008 introduced Server Core, the operating system without the Graphical User Interface. SQL Server 2012 is the first version of SQL Server to support Server Core. To gain the maximum efficiency from your servers you want to make use of Server Core and with PowerShell you can effectively manage SQL Server in that environment. This session will demonstrate how to manage SQL Server 2012 on Windows Server Core and provide scripts to help you perform the common tasks you need to perform on that platform.

    Please join us at the Triangle SQL Server User Group meeting in Raleigh, NC for this first look at making your servers more efficient.

    I look forward to seeing you there.

    Allen

  • T-SQL Tuesday #25 Followup - Just in Time for the Holidays

    T-SQL Tuesday #25 fell on the same day I brought my mother home from a 3-day emergency trip to the hospital. She's much better now, but it was a bit crazy there for a little bit. Thanks for everyone's best wishes.

    Rob Farley kicked off the official start of the event with a post titled A T-SQL Tip: Working calculations. He showed us how to use the CROSS APPLY join operator to facilitate date calculations.

    Greg M Lucas followed up with a list of Management Studio shortcuts (not exactly T-SQL, but useful nonetheless) here.

    Pinal Dave (aka the SQL Authority) contributed a few scripts to help you manage your SQL Server configuration with his post SQL SERVER - A Quick Look at Performance - A Quick Look at Configuration.

    We love comic books and super heroes, and Tracy McKibben (RealSQLGuy) fed that love with a couple of links to his performance capture scripts and to Adam Machanic's WhoIsActive script here.

    In her first ever T-SQL Tuesday post, Mala Mahadevan shared with us a number of scripts that provide useful information and automate some processes here.

    Sam Vanga's Tips and Tricks shared with us a way to clear the windows clutter of Management Studio and focus on the T-SQL here.

    On Chris Shaw’s Weblog we learn how to read the SQL Server error logs via T-SQL and manage them across many servers here.

    Using a method called "selective aggregates", Kent Chenery shows us how to efficiently summarize multi-year aggregate data here.

    Ricardo Leca shared a script to create mirrored databases in T-SQL and making use of SQLCMD here.

    A former Clevelander living in Washington, DC, Matt Velic shares a trick to make building referential integrity in databases easier in Easy Integrity.

    Jason Strate showed us a script that you can use to show the various execution plans using a specific index here. In some performance work I'm doing for a client right now this script will really come in handy.

    One of my favorite new people on the SQL Server scene, Jes Schultz Borland published a script to show how much transaction log space is used here.

    Bob Pusateri demonstrates how to concatenate the returned rows from a query into a single string without using a cursor here.

    My good friend John Sterrett has a nifty script to show us what queries are currently running right now in this post.

    John Pertell keeps track of table growth with the scripts in this post.

    Dev Nambi takes advantage of the INTERSECT operator to show how to ensure that query changes made for performance return the same result set here. I'm adding this one to my toolkit right away!

    Some folks, like Richard Lewis (@gogorichie), posted links to earlier posts with their favorite tricks here.

    David Howard points out some of the tricks in using the TOP keyword here.

    One of my friends from the Cleveland group, Colleen Morrow, talked about organizing Registered Servers and how to run queries against multiple servers simultaneously using the groups in the Registered Servers tab in Management Studio here.

    Aside from saying some very nice things about my presentations, Matt Nelson supplied a nifty script to return date range values via a view here.

    Sebastian Meine talks about transaction log reuse waits, and how to determine why the log keeps growing here.

    Nic Cain provides us a script that you can use to find out what Reporting Services reports have failed here.

    Steve Wales also provided links to earlier posts here.

    Jason Brimhall talked about some of the built-in property features in T-SQL here.

    Aaron Bertrand also showed us how to be more productive in Management Studio with his tips here.

    Rob Volk shares ways we can more easily use constraints in T-SQL here.

    Mark Broadbent introduces us to The Best Tool You’ve Never Used. (Not exactly a T-SQL tip, but cool nonetheless.)

    Another fellow Clevelander, Steve Smith, shares a grab bag full of cool tricks here.

    Per the standard rules of T-SQL Tuesday, this ends the event, but there were a few stragglers that came in late:

    Nigel Sammy: here.

    Steve Jones: here.

    Jason Grobe: here.

    Thanks to everyone who participated. I definitely learned some great new ways to work with SQL Server.

    Allen

  • Speaking about Performance and PowerShell

    I'll be presenting at the South East Michigan SQL Server User Group (SEMSSUG) tonight. The session will be Gather SQL Server Performance Data with PowerShell.

    We all know how important it is to keep a baseline of performance metrics that allow us to know when something is wrong and help us to track it down and fix the problem. We don't always know how to do this easily and consistently. This session will walk you through a series of PowerShell scripts you can schedule which will capture the most important data and a set of reports to show you how to use that data to keep your server running smoothly.

    You can register for the meeting via this link: http://semssug20111208.eventbrite.com/

    The meeting will be at the Raymond James Southfield Offices, 25900 Telegraph Rd, Southfield, MI 48033. (Actual entrance is on Civic Center Drive, just 100 meters east of Telegraph Rd.) Directions found by clicking here.

    The meeting starts at 6pm and runs until 8pm, and DBSophic is the sponsor, providing the pizza.

    I hope to see you there.

    Allen

  • T-SQL Tuesday #025 - Invitation to Share Your Tricks

    It doesn't seem that long ago that having cool little tidbits of information about SQL Server made a huge difference in how effective you could be. Well, that's still true, but let me give you an example.

    SELECT name FROM sysobjects WHERE sysstat & 4 > 0

    In the early days of SQL Server, this was the way to pull a list of the names of all the stored procedures in your database. The 4 bit in the sysstat column represented stored procedures. (1 represented user tables and 2 represented views, as I recall, so changing the WHERE clause to read WHERE sysstat & 7 > 0 returned all tables, views and stored procedures.)

    As SQL Server has evolved, Microsoft has made it easier to query the metadata to determine what objects exist, adding columns that help (like 'type' in this case), catalog views, Dynamic Management Objects, and so on.
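
    For comparison, here's what that same lookup might look like today using the catalog views (one option among several):

    -- Stored procedure names via the catalog views
    SELECT name FROM sys.objects WHERE type = 'P';
    -- or, more directly:
    SELECT name FROM sys.procedures;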

    So, the challenge for this month's T-SQL Tuesday is: What T-SQL tricks do you use today to make your job easier? (Notice I didn't say PowerShell - I have a bunch of those now, but this is T-SQL Tuesday, not PowerShell Tuesday.)

    Here are the rules:

    1. Your post must go live between 00:00:00 GMT on Tuesday December 13 and 00:00:00 GMT on Wednesday December 14.
    2. Your post must link back to this post, and the link must be anchored from the logo (found above) which must also appear at the top of your post.
    3. Leave a comment or a trackback here on this blog regarding your post.
    4. 'T-SQL Tuesday #025' should be included in the title of the post.
    5. If you're on Twitter, tweet about your post using the hash tag #TSQL2sDay

    I'm looking forward to reading what goodies you're all going to share. After the event closes I'll post a summary of all your contributions.

    Allen
