THE SQL Server Blog Spot on the Web


Allen White

  • [Speaking] PowerShell at the PASS Summit

    Next week is the annual PASS Summit, the event of the year for those of us in the SQL Server community. We get to see our old friends, make new friends, and learn an amazing amount about SQL Server, and it'll be in Seattle, so it's close to the mother ship. I love having Microsoft close, because it's easier to get to know the people who actually make this amazing product we spend our lives working with.

    This year I'm fortunate to have been selected to present three sessions. One is a regular session called Manage SQL Server 2012 on Windows Server Core with PowerShell, where I'll be showing you how to set up and install SQL Server 2012 on Windows Server 2008 R2 and Windows Server 2012 Server Core, with some cool things that make that setup easy to manage. I'm also doing a Spotlight session called Maintain SQL Server System and Performance Data with PowerShell, where I'll show you how to create a database to contain the inventory of the servers you manage, and how to gather performance metrics, all with PowerShell.

    On Tuesday, I'll also be doing a pre-conference session all day called Automate and Manage SQL Server with PowerShell. I truly enjoy spending the day with this session. We start out with an introduction to PowerShell, because you really need some basics before the more advanced topics make sense. This is not a comprehensive PowerShell course, because that would take days, but the introduction will give you enough to truly get started.

    Once we get the basics down, we'll dive into Server Management Objects (SMO), because the SQL Server team built this interface specifically for managing SQL Server. You may love SMO or you may hate it, but understanding its structure will help you automate your tasks as you manage your environment. We'll get into the various activities a DBA is expected to manage and walk through scripts that solve those problems. As the day goes on we'll get into managing Central Management Server and Policy-Based Management, and we'll cover the topics in the other sessions I'll be doing on Wednesday. We'll also see how much more important PowerShell is in Windows Server 2012 than ever before.

    If you haven't sensed the excitement about the upcoming PASS Summit, check it out, and I hope to see you there!


  • Search For a Query in RDL Files with PowerShell

    In tracking down poorly performing queries for clients I often encounter the query text in a trace file I've captured, but don't know the source of the query. I've found that many of the poorest performing queries are those written into the reports the business users need to make their decisions. If I can't figure out where they came from, usually years after the queries were written, I can't fix them.

    The first thing I did was find a great utility called RSScripter, which opens a Windows dialog that allows you to connect to a Reporting Services server and script the reports to RDL source files in a directory you specify. By exporting the RDL files from the Reporting Services server, I can search through them and find the queries I need.

    The next thing I do is set a string variable to contain the query in question, which I do here using a here-string:

    $q = @'
    SELECT  d.SalesOrderID
    FROM    Sales.SalesOrderDetail d
            INNER JOIN Sales.SalesOrderHeader h ON d.SalesOrderID = h.SalesOrderID
            INNER JOIN Sales.SpecialOffer o ON d.SpecialOfferID = o.SpecialOfferID
    WHERE   d.SpecialOfferID <> 1
    '@

    Now, when RSScripter runs it pulls out everything from RS, and I'm only interested in the RDL files so I use the Get-ChildItem cmdlet and pipe the output through the Where-Object cmdlet to select just those files whose extension is '.rdl', and assign that to a variable.

    $rdls = Get-ChildItem | where-object {$_.Extension -eq '.rdl'}

    Now I have a collection of RDL file objects, which I can iterate through with the foreach command. For each one I'm going to display the file name, then use the Get-Content cmdlet to read the contents of the RDL file, and pipe that to the Select-String cmdlet to look for the query string. If a match is found it'll display that line after the file name.

    foreach ($rdl in $rdls) { write-output $rdl.Name; get-content $rdl.FullName | select-string $q }

    While this may be a little crude, it's kind of an ad-hoc way of finding where a query comes from. There are recursive options for Get-ChildItem that'll search the entire subdirectory tree, but for now I didn't need that.
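    If you do need to search the whole tree, something like this should do it (reusing the same $q here-string, adding -Recurse, and displaying FullName so you can tell which subdirectory a match came from):

    # Search every .rdl file in the current directory and all subdirectories
    $rdls = Get-ChildItem -Recurse | where-object {$_.Extension -eq '.rdl'}
    foreach ($rdl in $rdls) { write-output $rdl.FullName; get-content $rdl.FullName | select-string $q }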

    I hope this helps you find some of those troublesome queries you're struggling with.


  • Scanning the Error Log with PowerShell

    One of the most important things you can do as a DBA is keep tabs on the errors reported in the error log, but there's a lot of information there and sometimes it's hard to find the 'good stuff'. You can open the errorlog file directly in a text editor and search for errors, but that gets tedious, and string searches generally return just the lines containing the error message numbers, while the real information you want is in the line after that.

    PowerShell's Select-String cmdlet searches through a text file and returns the lines matching the target string. In PowerShell 2.0 it gained a -Context parameter, which allows you to specify the number of lines to return before and after the match. Bingo! By specifying -context 0,1 with the Select-String cmdlet, it returns the line with the error plus the line after it.

    To see what this does, first navigate to the errorlog directory, then issue the following commands:

    $errlog = Get-Content '.\ERRORLOG'
    $errlog | Select-String  -pattern 'Error:' -context 0,1

    Now that's really useful, but it does mean that I have to log on to each server to get the errors, or I have to somehow know where every server's errorlog directory is and connect through the network to get to the files. Not so nice if you want to work from a client and 'just get the errors'.

    Using SMO, there's a method under the Server object called ReadErrorLog() which reads the error log and (if assigned to a variable) creates a DataRow object containing the LogDate, ProcessInfo and Text properties from the errorlog. The ProcessInfo property contains the spid or other identifying value to indicate the source of the problem. By connecting to an instance using the Server object and loading the error log into the DataRow object, we can then get similar results, but we have to add an additional parameter, -inputobject, which lets us specify the property to search for our string.

    $svr = new-object ('Microsoft.SqlServer.Management.SMO.Server') MyServer
    $err = $svr.ReadErrorLog()
    $err | Select-String -inputobject {$_.Text} -pattern 'Error:' -context 0,1

    The advantage of this solution is that I don't log into the target server and I don't have to know where the log file is. The downside is that we lose the LogDate and ProcessInfo properties altogether, and there's no way to piece the parts back together. (Now, as sure as I say this Chad Miller will pipe up with a cleaner way to do this. This is good, and it's how I learn.)

    What I found that works, though it's a tad clunky for my taste, is to concatenate the properties back together before sending it to the Select-String cmdlet, like this:

    $errlog = @()
    $err | foreach { $errlog += [string] $_.LogDate + ' ' + $_.ProcessInfo + ' ' + $_.Text }

    The problem with this approach, of course, is you have to pass through the errorlog twice. To minimize this, I added an option to only report the errors starting with a particular date.

    $errlog = @()
    $err | where {$_.LogDate -ge $startdt} | foreach {
    	$errlog += [string] $_.LogDate + ' ' + $_.ProcessInfo + ' ' + $_.Text
    }

    This minimizes the issue, and if you're properly cycling the error log (normally once a week) the amount of log data it has to cycle through twice isn't too bad. With the properties concatenated we then return to the original search command to get our results.

    $errlog | Select-String  -pattern 'Error:' -context 0,1

    But wait, there's more! After playing with it for a bit I realized I'd like an option to return just DBCC results. Unlike errors, DBCC results are reported on the same line that records the DBCC command being run, so I don't need the line afterward. In my script I added a variable called $srch to the command-line arguments, and if it's equal to 'DBCC' it returns just the DBCC lines.

    # Search the errorlog and return any error and the subsequent detailed message
    if ($srch -eq 'DBCC') {
    	$errlog | select-string -pattern 'DBCC' -context 0,0
    }
    else {
    	$errlog | select-string -pattern 'Error:' -context 0,1
    }

    And that's it! I named the script scan-errorlog.ps1, and it takes the instance name, start date and (optionally) 'DBCC' as parameters, and returns either errors for that server after the start date, or DBCC results since the start date.

    ./scan-errorlog.ps1 MyServer 9/25/2012

    That command line will return any errors reported in the errorlog since midnight on September 25, 2012.
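    If you pass 'DBCC' as the third parameter, the same command returns just the DBCC results since that date:

    ./scan-errorlog.ps1 MyServer 9/25/2012 DBCC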

    It's really fun to find gems like this that make our lives as DBAs much easier.


  • 24 Hours of PASS - PowerShell 101 for the SQL Server DBA

    Thanks to everyone who came out for the session today.  It was a lot of fun and I hope you had a great experience as well.  I've attached the demo scripts and slide deck to this post, and I look forward to seeing you in November at the PASS Summit!


  • Speaking - 24 Hours of PASS, Summit Preview Edition

    There's so much to learn to be effective with SQL Server, and you have an opportunity to immerse yourselves in 24 hours of free technical training this week from PASS, via the 24 Hours of PASS event.

    I'll be presenting an introductory session on PowerShell called PowerShell 101 for the SQL Server DBA. Here's the abstract:

    The more you have to manage, the more likely you'll want to automate your processes. PowerShell is the scripting language that will make you truly effective at managing lots of servers. But it's more than just a scripting language - it's an interactive shell that stores data for you and allows you to implement ad-hoc solutions quickly and easily. Within the PowerShell environment you can easily manage both SQL Server instances and the Windows servers themselves, giving you a 'best of both worlds' environment that puts you in control. This session will introduce you to PowerShell and show you how to use it to manage SQL Server across many instances.

    I look forward to seeing you there.


  • Speaking - Red Gate's SQL in the City

    The great folks at Red Gate have invited me to join the festivities at the SQL in the City events in both Chicago on October 5, and in Seattle on November 5.

    In both cities I'll be presenting a session entitled Automated Deployment: Application And Database Releases Without The Headache. Here's the abstract:

    Ever since applications were first created, the deployment of updates and changes has been a headache, with the potential of disruption of the application at best and data corruption at worst. Getting the steps of any deployment right are critical to the success, and if there's a problem it's even harder. Data structures depend on code, and code depends on data structures. In this session we'll look at how deployment has been done in the past, the problems encountered, and we'll look at some ways to mitigate the risk inherent in application deployment, including Red Gate's new Deployment Manager tool, which helps you streamline the whole process.

    There is a great lineup of speakers at all the events, including my good friends Steve Jones (b | t) and Grant Fritchey (b | t), who both work for Red Gate, but also people like Brent Ozar (b | t), Jes Borland (b | t), Buck Woody (b | t) and Adam Machanic (b | t), all good friends and great speakers.

    You can sign up for the Chicago event here and for the Seattle event here. I look forward to seeing you!


  • T-SQL Tuesday #34: Help me, Obi-Wan Kenobi, You're My Only Hope!

    This T-SQL Tuesday is about a person who helped you understand SQL Server.

    It's not a stretch to say that it's people that help you get to where you are in life, and Rob Volk (@sql_r) is sponsoring this month's T-SQL Tuesday asking who is that person that helped you get there.

    Over the years, there've been a number of people who've helped me, but one person stands out above the rest, who was patient, kind and always explained the details in a way that just made sense.

    I first met Don Vilen at Tech Ed 2006 in Boston. I'd seen him present at the PASS Summit a couple of times, but at that Tech Ed I'd started to get to know people and felt comfortable just hanging out in the SQL Server area, and we started talking. He was easy going, and in talking about issues at work he explained how things worked in a way that no one else had (at least to me). As a side note, it was at that event that he introduced two of his friends to each other: Kimberly Tripp and Paul Randal.

    At subsequent conferences I made a point to see his presentations, no matter the subject. His explanation of how database snapshots work was the clearest I've ever seen, and I've used it in many classes I've taught since. I always enjoyed his company, and conversation with him was lively and entertaining, regardless of the subject. I was sad to find out, shortly after I received my first MVP Award in July 2007, that he was leaving Microsoft. He joined Scalability Experts as a Principal Architect that fall.

    In July 2008, through a strange set of circumstances, I joined Scalability Experts myself, as a trainer. Since there were more consulting engagements than training assignments at first, I was assigned to a project, and the architect assigned to the project was none other than Don. He mapped out what we had to do and explained some aspects of SQL Trace I'd never used, and I was able to create a significant load test out of a relatively small trace file with his help.

    The following spring another assignment put me together with Don evaluating different server hardware using the same workload, and with Don's help I was able to get a much better understanding of baselines and performance metrics, and I still use that knowledge in client engagements and in my presentations today.

    The last time we worked together was for a major financial services company in New York City, and Don led a team of us to put together a Center of Excellence plan for the company, to help their staff become the best in the business with respect to their use of SQL Server. My focus was on the Database Developers, and with Don's help I built presentations and documentation on all aspects of developing fast and efficient queries and processes to run against SQL Server. The environment was both high profile and highly stressful, and Don always kept his cool and was always helpful.

    After I left Scalability, my daughter and son-in-law were flying back to the US from Germany, where my son-in-law was stationed in the Army, when a blizzard hit the east coast and they shut down the airports after their plane landed at JFK. Because of the conditions there were no hotels, and the next flight the kids could get out wasn't until four days later. I called Don, who had an apartment not far from Central Park, and he was kind enough to put the kids up for the night. I drove to New York and picked them up there and brought them home the next day.

    There've been great people who've helped me before and since, but no one stands out like Don. Thank you, Don, for everything.


  • Your Presence Matters at the PASS Summit

    This year will be my tenth year attending the annual PASS Summit. It's in Seattle again this year, which to me is important because of the proximity to the Microsoft offices and the people on the SQL Server dev team, as well as the CX (formerly CAT) teams that help so many people get the most out of SQL Server.

    The conference is the biggest event in the world specifically focused on SQL Server. As a result, it's an opportunity to meet many people who are directly focused on the SQL Server platform, and those individuals have created a unique community around PASS and the PASS Summit. It's within that community where the relationships develop that help you solve the problems you encounter every day when working with SQL Server. The key word here is relationships. Technology is always changing, but those relationships will help you navigate that changing technology, because those people are going through the same things you are.

    It's not just attendees that benefit from those relationships, though. I've been fortunate to have been selected to speak at the Summit each year since 2006. As I've gotten to know the other speakers, those acquaintances have become friendships, and some have flourished into close personal friendships that go far beyond SQL Server. I look forward to the big events because I get to see my friends again, and we all get to hang out together and do some crazy things like karaoke.

    More importantly, though, I get the opportunity to learn from others. No one person knows it all. It's not possible; SQL Server is just too big. Even with topics I know really well, I've discovered there's always some new aspect I've never encountered that I can learn from others. For example, when I was at SQL Rally in Dallas I attended a basic SQL Server backup session given by Sean McCown. I've been doing SQL Server backups for 20 years, but I learned something in that session. No matter who you are, and how much you know, you can learn something from others.

    For those reasons it's important for you to be there. You'll learn something from the people who are there. We'll learn something from you. We'll get to spend some time together, and maybe get to know each other a little better. We're all better off for it, and this is how we build our community.

    So no matter if you're a brand, spanking new DBA, or the person who wrote the query processor, please make a point to come to the PASS Summit. Let's spend some time together, have a little fun, learn something, and mostly, make the community a little bit stronger.

    I look forward to seeing you there.


  • SQLCruise Alaska was Amazing

    You'd think that providing in-depth SQL Server training on a cruise ship would be an excuse for a vacation disguised as a business trip, but you'd be wrong. This past week I traveled with the founders of SQLCruise, Tim Ford and Brent Ozar, along with other top professionals in the SQL Server world: Jeremiah Peschka, Kendra Little, Kevin Kline and Robert Davis.

    The week began with Brent presenting a session on Plan Cache Analysis, which I plan to start using very soon. After Brent, Kevin presented a four-hour leadership training session. He's recorded a set of training videos which will really help IT professionals make the transition to management.

    After spending the day snorkeling in Ketchikan, Alaska with Snorkel Alaska, I presented a short session on SQL Sentry's Plan Explorer and announced Wayne Sheffield as the winner of the Power Suite from SQL Sentry for having the "hairiest execution plan". SQL Sentry paid for my cruise and travel as their representative, and I'm really grateful to them for the opportunity. They have a great suite of products, and they're great people, and that makes me proud to recommend them, as well as to represent them. Following the contest, Jeremiah did an interesting session called "Killer Techniques for Database Performance" where he went into really great detail about the real cost of accessing data in applications.

    Tuesday was mostly a sightseeing day, as we spent the day in Juneau, where I did a tour called "Glacier View Bike and Brew", put together by Cycle Alaska. A van picked us up and took us north to the University of Alaska Southeast on Auke Lake, and we rode bikes nine miles to Mendenhall Lake, where we got great views of Mendenhall Glacier, then on to the glacier visitor center. Then they drove us into Juneau to a pub where we had a tasting of the different brews from the Alaskan Brewing Company. We reboarded the ship and cruised out to Sawyer Glacier to see that spectacular sight, and on the way participated in a storytelling session in Brent's suite. Some great stories were told, and Darcy Williams won an Amazon Kindle Fire for the best IT story.

    Wednesday was spent in Skagway, where Tim arranged a van tour up through the White Pass which the gold rush prospectors had to traverse to get to the fields near Dawson City in the Yukon Territory. We followed that up with lunch and beverages at the Skagway Brewing Company, or "BrewCo", as they call it locally.

    After three days spent mostly sightseeing, we got back to serious training on Thursday, when I did a shortened version of my PowerShell pre-con. It's hard to fit a truly comprehensive "PowerShell for SQL Server" session into four hours, but I think I covered enough to get people comfortable doing real work, and showed some advanced features to give them a sense of what's possible once they're up to speed. After my session, Kendra did a session called "Sports Medicine For High Transaction Databases", where she talked about benchmarking and methods to help your high-volume transactional systems perform under heavy load. The day ended with Robert doing a session on "Recovering Data and Databases", where he talked about how data is stored, how it's backed up, and how to recover from corruption.

    The last day of the cruise started with Brent, this time talking about Scaling SQL with Solid State, where I learned that even putting solid state drives in place may not solve I/O subsystem performance problems, and how to plan for the best performance using them. We followed that up with the Idera Iditarod Challenge, where the cruisers had to answer SQL questions and perform additional challenges, and winners Malathi Mahadevan and Max Gomeniouk each won an Amazon Kindle Fire from Idera. We ended the day visiting Victoria, BC.

    The best part of the cruise, aside from the great training, was getting to know the cruisers and spending real time with them. It's often hard when working as a database professional because we seldom work in a team, and it can be difficult to know what questions to ask when presented with a problem. This training is unique because we spend time together, not just in class, but visiting the sights, and talking about SQL. After a week together we all have new people we can go to with questions and ideas, to see if they're valid or to get problems solved. I'm truly impressed with the work that Tim does in making the cruise a success for everyone, and I hope you'll consider signing up for one of next year's cruises. Check it out at


  • SQL Rally Presentations

    As I drove to Dallas for this year's SQL Rally conference (yes, I like to drive) I got a call asking if I could step in for another presenter who had to cancel at the last minute.  Life happens, and it's best to be flexible, and I said sure, I can do that. Which presentation would you like me to do? (I'd submitted a few presentations, so it wasn't a problem.)

    So yesterday I presented "Gathering Performance Metrics With PowerShell" at 8:45AM, and my newest presentation, "Manage SQL Server 2012 on Windows Server Core with PowerShell" at 1PM at the conference here in Dallas.  The room was packed for both sessions and I truly appreciate every one of you who attended and there were great questions in both sessions.

    I've attached a zip file containing the slide decks from both presentations along with the demo scripts. I hope you get as much out of using PowerShell with SQL Server as I have. Thanks!


  • Script and Migrate Agent Jobs between Servers using PowerShell

    I'm nearing the end of a SQL Server upgrade project, migrating from SQL Server 2000 to SQL Server 2008 R2. The company uses SQL Server Agent to run about 150 jobs which execute DTS packages to do the work of moving data around. Now, there are any number of ways to move these jobs from one server to the other, but I wanted to be able to exclude certain jobs, make some minor changes before loading them onto the new server, that kind of thing. PowerShell makes that relatively easy to do.

    First, most of the jobs were owned by individual logins, many belonging to people no longer with the company. This happens over time, but should be cleaned up. To quickly change all the jobs to be owned by 'sa', it's just a few lines of code:

    # Connect to the instance using SMO
    $s = new-object ('Microsoft.SqlServer.Management.Smo.Server') $inst
    # Iterate through the jobs to reset the job owner
    foreach ($job in $s.JobServer.Jobs) {
    	if ($job.Owner -ne 'sa') {
    		$job.OwnerLoginName = 'sa'
    		# Persist the change to the server
    		$job.Alter()
    	}
    }
    My scripts usually accept $inst, the SQL Server instance name, as a command-line argument, so I can use the script on any server. Once this script had run, all the jobs were owned by sa, and I didn't have to worry about enabling old logins on the new server. The next thing is to script out the jobs into .sql files, one for each job. In this case I pass in a base directory path as a second command-line argument, then use PowerShell to create the Jobs directory under it (note that if you're using a named instance, I'd recommend using the encode-sqlname cmdlet to turn the $inst variable into an encoded name, to prevent the backslash from creating problems):

    # Create the Database root directory if it doesn't exist
    if (!(Test-Path -path "$base\$inst\Jobs\")) {
    	New-Item "$base\$inst\Jobs\" -type directory | out-null
    }
    $homedir = "$base\$inst\Jobs\"
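    For a named instance, the encode-sqlname step mentioned above is a one-liner (this assumes the sqlps module that provides the cmdlet is loaded):

    # Escape the backslash in a named instance name so it's safe in a file path
    # e.g. 'MyServer\Inst1' becomes 'MyServer%5CInst1'
    $inst = encode-sqlname $inst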

    The next step is to set the properties for the script files:

    # Instantiate the Scripter object and set the base properties
    $scrp = new-object ('Microsoft.SqlServer.Management.Smo.Scripter') ($s)
    $scrp.Options.ScriptDrops = $False
    $scrp.Options.WithDependencies = $False
    $scrp.Options.IncludeHeaders = $True
    $scrp.Options.AppendToFile = $False
    $scrp.Options.ToFileOnly = $True
    $scrp.Options.ClusteredIndexes = $True
    $scrp.Options.DriAll = $True
    $scrp.Options.Indexes = $True
    $scrp.Options.Triggers = $True

    Then we can cycle through the jobs and create scripts for each job on the server.

    # Create the script file for each job
    foreach ($job in $s.JobServer.Jobs) {
    	$jobname = $job.Name
    	$scrp.Options.FileName = "$homedir\$jobname.sql"
    	# Write the CREATE script for this job to its own file
    	$scrp.Script($job)
    }
    There are some jobs on the original server that won't move - things like maintenance plan jobs, which I'm replacing with more efficient methods, so having individual script files allows me to simply delete the script files for the jobs not moving to the new server. Once I've done that I have another script which will load the script file into a PowerShell variable, and using invoke-sqlcmd, create the job on the new server. Now, each script file ends with a 'GO' statement, and invoke-sqlcmd doesn't like that, so I want to strip it out, which is easy enough to do with the Replace method. The problem is that it also contains a GOTO command to bypass the error notification. I use the Replace() method to get around that by first replacing GOTO with OTOG, then replacing GO with an empty string, then replacing OTOG again with GOTO.

    The other 'tricky' thing I had to do was to convert the script file to a PowerShell string. I use the Get-Content cmdlet to pull the script into a PowerShell object. I then use the here-string construct with the PowerShell object inside the here-string, and that converts it from an object to a System.String object for me, and then I can easily supply that string as the query for invoke-sqlcmd.

    # Get all the files with a .sql extension in the directory and cycle through them
    $files = get-childitem $basedir | where {$_.Name -like '*.sql'}
    foreach ($file in $files) {
    	$fname = $file.Name
    	$sql = get-content "$basedir$fname"
    	# Convert the array of lines into a single string via a here-string
    	$qry = @"
    $sql
    "@
    	$qry = $qry.Replace('GOTO','OTOG')
    	$qry = $qry.Replace('GO','')
    	$qry = $qry.Replace('OTOG','GOTO')
    	invoke-sqlcmd -ServerInstance $inst -Database master -Query $qry
    }

    Now the jobs are loaded on the new server, but of course the problem is that they're ACTIVE. Until we go live we don't want them running, so we need to disable them en masse. That's pretty simple as well.

    $jobs = $s.JobServer.Jobs 
    foreach ($job in $jobs) {$job.IsEnabled = $False; $job.Alter()}

    And now the jobs are all in place, and ready to be enabled once the green light is given on the migration.
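    When the green light comes, re-enabling them is the mirror image of the same loop:

    # Re-enable every job once the migration goes live
    foreach ($job in $jobs) {$job.IsEnabled = $True; $job.Alter()}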


  • System Inventory and Performance Gathering SQL Connections Session

    Yesterday I delivered a session on maintaining a system inventory and gathering performance data with PowerShell.  The session went well and I had great questions and I'd like to thank everyone who attended the session.  I've attached the slides and demo scripts from the session for your use.


  • PowerShell Precon session at SQL Connections

    Yesterday I had the privilege of presenting the full day training session SPR304-Automate and Manage SQL Server with PowerShell at the SQL Connections conference in Las Vegas.  The session went very well (at least from my perspective) and I think the attendees enjoyed it as well.

    Just the day before the session I got excited about some features of PowerShell I hadn't played with much and decided to add a discussion of them to the presentation, so the materials the conference distributed don't include that content. (I also had a minor bit of demo-fail, so I've corrected those scripts as well.)

    I've attached the presentation deck and all the demo scripts to this post, but please, don't ask me for the password. If you were in the room yesterday you know what it is, and as the beer commercial says, "this one's for you."

    Thanks for attending!


  • Speaking About SQL Server

    There's a lot of excitement in the SQL Server world right now, with the RTM (Release to Manufacturing) release of SQL Server 2012, and the availability of SQL Server Data Tools (SSDT). My personal speaking schedule has exploded as well. Just this past Saturday I presented a session called Gather SQL Server Performance Data with PowerShell.

    There are a lot of events coming up, and I hope to see you at one or more of them. Here's a list of what's scheduled so far:

    First, I'll be presenting a session called Automate Policy-Based Management Using PowerShell at the 24 Hours of PASS on March 21.

    The Policy-Based Management feature in SQL Server 2008 provides a great way to ensure your systems are configured consistently and correctly, but it can be tedious to implement on each server in your environment. PowerShell scripts let you automate the implementation of your policies so that you can focus on more important problems. This session walks you through how PBM works, how to define your policies in PowerShell, and how to set up Agent jobs that evaluate those policies regularly and let you know when you need to take action.


    The following week, I'm really excited to be one of the featured presenters at the SQL Connections conference in Las Vegas. This event is also one of the official launch events for SQL Server 2012. At this event I'll be presenting an all day workshop called SPR304: Automate and Manage SQL Server with PowerShell. With PowerShell becoming more and more a key tool in managing all server systems, this workshop will get you up to speed quickly, ready to take control of SQL Server.

    This soup-to-nuts all day workshop will first introduce you to PowerShell, after which you’ll learn the basic SMO object model, how to manipulate data with PowerShell and how to use SMO to manage objects. We’ll then move on to creating Policy-Based Management policies, work with the Central Management Server, manage your system inventory and gather performance data with PowerShell. We’ll wrap up with a look at the new PowerShell cmdlets introduced for SQL Server 2012 and how you can use PowerShell to manage SQL Server 2012 in server environments including the new Always On technology and Windows Server Core. After this one day, you’ll be ready to go to work and able to use PowerShell to make you truly effective.
    In addition, I'll be doing three regular conference sessions.
    SQL226: Scalable Application Design with Service Broker. How directly your business applications interact with the corporate database can have a dramatic impact on how successful you are, and one of the biggest issues is performance with large databases. By designing the application to use asynchronous messaging for non-critical updates, you can focus your performance concerns where timeliness is critical. SQL Server Service Broker handles the asynchronous requests quickly and efficiently, minimizing conflicts with your critical updates. This session will walk you through the steps to get Service Broker up and running and providing consistent delivery of your business data.
    SQL325: Maintain SQL Server System and Performance Data with PowerShell. Maintaining a solid set of information about our servers and their performance is critical when issues arise, and often helps us see a problem before it occurs. Building a baseline of performance metrics allows us to know when something is wrong and helps us track down and fix the problem. This session will walk you through a series of PowerShell scripts you can schedule to capture the most important data, plus a set of reports that show you how to use that data to keep your servers running smoothly.
    SQL427: Manage SQL Server 2012 on Windows Server Core with PowerShell. Windows Server 2008 introduced Server Core, the operating system without the graphical user interface. SQL Server 2012 is the first version of SQL Server to support Server Core. To gain maximum efficiency from your servers you'll want to make use of Server Core, and with PowerShell you can effectively manage SQL Server in that environment. This session will demonstrate how to manage SQL Server 2012 on Windows Server Core and provide scripts to help you perform the common tasks you need on that platform.
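    The baseline-building approach described in SQL325 typically starts with something as simple as sampling a handful of Performance Monitor counters. A minimal sketch using the built-in Get-Counter and Export-Counter cmdlets (PowerShell 2.0 and later) follows; the counter list, intervals, and output path are illustrative choices, not the session's exact scripts:

    ```powershell
    # A few counters worth tracking over time on a database server.
    $counters = @(
        '\Processor(_Total)\% Processor Time',
        '\Memory\Available MBytes',
        '\PhysicalDisk(_Total)\Avg. Disk sec/Read'
    )

    # Take 5 samples, one every 15 seconds, and save them for later reporting.
    Get-Counter -Counter $counters -SampleInterval 15 -MaxSamples 5 |
        Export-Counter -Path 'C:\Baseline\perf.blg' -FileFormat BLG
    ```

    Scheduled regularly, samples like these accumulate into the baseline that tells you what "normal" looks like for a given server.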


    On April 14, I'll be at SQL Saturday 111 in Atlanta, presenting the Manage SQL Server 2012 on Windows Server Core with PowerShell session from SQL Connections.

    The Akron-Canton Chapter of AITP invited me to speak at their monthly meeting on April 24, where I'll be presenting Scalable Application Design with Service Broker.

    I was excited to be selected for this year's PASS SQL Rally conference in Dallas May 10-11, where I'll be presenting the Manage SQL Server 2012 on Windows Server Core with PowerShell session as well.

    Finally, and I am really excited about this opportunity, SQL Sentry asked me to represent them on this May's SQL Cruise to Alaska. The SQL Cruise is a great opportunity for professionals to get focused training on SQL Server from some of the top professionals in the SQL Server industry. If you look at the cost per training day, including hotel and meals, and compare it to the cruise, you'll find that the cruise works out as a better price than traditional training, and you'll get to see some of the most beautiful landscapes in the country! I'll be presenting a four hour version of my PowerShell workshop, and I'd love to see you there.

    Well, that's what's on the schedule for the spring. The summer will be a bit less busy, but I plan to get to a few events over the summer as well. I hope to see you at some of these great training events!


  • Set and Verify the Retention Value for Change Data Capture

    Last summer I set up Change Data Capture for a client to track changes to their application database and apply those changes to their data warehouse. The client had some issues a short while back and felt they needed to increase the retention period from the default 3 days to 5 days. I ran this command to make that change:

    EXECUTE sys.sp_cdc_change_job @job_type = 'cleanup', @retention = 7200;

    The value 7200 is the number of minutes in 5 days (5 x 24 x 60). All was well, but they recently asked how they can verify that the change was made. Well, it turns out the values are stored in the msdb database, in a table called dbo.cdc_jobs. I put together this quick query, which they can run in any CDC-enabled database to find the retention value:

    SELECT [retention]
      FROM [msdb].[dbo].[cdc_jobs]
      WHERE [database_id] = DB_ID()
      AND [job_type] = 'cleanup'

    It's simple, but without knowing where to look, it can be confusing.
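    Since this is, after all, a PowerShell blog, the same check can be run from a script. A sketch using the Invoke-Sqlcmd cmdlet from the SQLPS module follows; the instance and database names are placeholders, and because the query runs against msdb rather than in the CDC database's own context, it passes the database name to DB_ID() explicitly:

    ```powershell
    # Load the SQL Server PowerShell module.
    Import-Module SQLPS -DisableNameChecking

    # Placeholder names: substitute your instance and CDC-enabled database.
    $query = @"
    SELECT [retention]
      FROM [msdb].[dbo].[cdc_jobs]
      WHERE [database_id] = DB_ID('MyCdcDatabase')
      AND [job_type] = 'cleanup'
    "@

    $row = Invoke-Sqlcmd -ServerInstance 'MYSERVER' -Query $query

    # The retention value is stored in minutes; 1440 minutes per day.
    Write-Output ("Retention: {0} minutes ({1} days)" -f $row.retention, ($row.retention / 1440))
    ```

    Run against a server configured as above, this reports 7200 minutes, or 5 days.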

