SQLblog.com - The SQL Server blog spot on the web
Merrill Aldrich

All Those Extra Bits We Need for DR

If you’re running mirrored databases or log shipping for disaster recovery, no doubt you’ve faced the pain of keeping all the ancillary objects up to date on your DR server: SQL Agent jobs, logins, database owners, custom error messages, and so on. I’ve been trying to tighten things up in this area at our place of business, and came up with a few scripts I thought I’d share. Some of these ideas come indirectly from Thomas Grohser and others with extensive DR experience who have been kind enough to speak or blog. Here I’ll post three scripts (for logins, jobs, and ownership), each demonstrating a slightly different technique for fetching and/or migrating these extra objects between servers.

Framework

As a basic setup, I’ve got an “admin” database on each SQL instance, where I can store DR and performance information on an ongoing basis. Those admin databases can be mirrored or log-shipped around to provide availability for this information, should a primary server go down. Today we’ll look at scripts to populate those little databases with the required objects from master and msdb to be able to bring a DR instance online.

I have settled on a naming convention that avoids collisions as these databases are copied or mirrored around the enterprise: DBAdmin_<servername>[_<instancename>]. Consistent naming, following this pattern, allows for some nice script re-use, and lets the databases be “bi-directionally” mirrored between a primary and failover server. That is, the primary server’s admin database is copied to its failover partner, and vice versa.
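For reference, the same naming rule can be derived in T-SQL on any instance. This is just a sketch: SERVERPROPERTY('ServerName') returns server\instance for a named instance, and we replace the backslash with an underscore, exactly as the PowerShell scripts later in this post do:

```sql
-- Sketch: derive the admin database name for the local instance,
-- replacing the backslash of a named instance with an underscore
DECLARE @adminDBName sysname;
SET @adminDBName = 'DBAdmin_'
    + REPLACE( CAST( SERVERPROPERTY('ServerName') AS sysname ), '\', '_' );
PRINT @adminDBName;  -- e.g. DBAdmin_MYSERVER_INST1 (hypothetical server name)
```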

Owners

First, we have the issue of database ownership. Depending on exactly how the databases on a DR instance were restored to initialize mirroring or log shipping, the database owners might be correct, or might all be set to the principal that performed the restore instead of the correct owner. Since this is something of an unknown, it’s important to be able to set all the owners should the databases have to be brought online in an emergency. There’s a bit of a catch-22 here, because ownership can’t be set while the databases are offline or in recovery, so I need the ownership information handy in case it has to be corrected in a pinch while the primary server is down (or, knock wood, gone altogether). Luckily, this one is super simple to set up, as it’s basically just a little dynamic-SQL-flavored query.

First, I create the DBAdmin_myServer database, if it doesn’t exist. Then, a table to store the script that would be required to set all the owners of all the databases on the instance:

CREATE TABLE [dbo].[DR_DatabaseOwnerScripts](
   [AsOfDate] [datetime] NOT NULL,
   [DatabaseOwnerScript] [varchar](max) NOT NULL
) ON [PRIMARY]
GO

ALTER TABLE [dbo].[DR_DatabaseOwnerScripts] 
ADD  CONSTRAINT [DF_DR_DatabaseOwnerScripts_AsOfDate]  
  DEFAULT (getdate()) FOR [AsOfDate]
GO

Next, a simple stored procedure can take the information from system tables and assemble a script that, if run, would set all the database owners. The resulting script text is filed away, by date, in the table:

CREATE PROCEDURE [dbo].[recordDatabaseOwnerScript] AS
BEGIN
 
DECLARE @ownerscript varchar(MAX);
 
SET @ownerscript =
'/*
 
Script to set ownership on all databases
Use this script to apply correct owners at the DR side of mirrored
or log-shipped databases, in the event of failover
 
'
 
SET @ownerscript = @ownerscript
+ 'Script generated ' + convert( varchar(50), GETDATE() ) + '
 
*/
 
';
 
SELECT
      @ownerscript = @ownerscript
      + 'ALTER AUTHORIZATION ON DATABASE::' + QUOTENAME( d.[name] )
      + ' TO ' + QUOTENAME ( syslogins.[name] ) + ';' + CHAR(13)
FROM [master].sys.databases d
      INNER JOIN [master].sys.syslogins ON d.owner_sid = syslogins.sid
WHERE d.[name] NOT IN ( 'master', 'model', 'msdb', 'tempdb' )
      AND d.state_desc = 'ONLINE';
 
PRINT @ownerscript;
 
INSERT  dbo.DR_DatabaseOwnerScripts
            ( DatabaseOwnerScript )
VALUES  ( @ownerscript );
 
END
GO

This procedure goes into a SQL Agent job, run daily. The job also has a clean-up step that deletes all but the last five or so entries, so the table doesn’t grow without bound over time.
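The clean-up step itself isn’t shown above; one possible version (hypothetical, assuming we keep the five most recent snapshots) is just a keyed DELETE:

```sql
-- Hypothetical clean-up step: keep only the five most recent snapshots
DELETE FROM dbo.DR_DatabaseOwnerScripts
WHERE AsOfDate NOT IN (
    SELECT TOP 5 AsOfDate
    FROM dbo.DR_DatabaseOwnerScripts
    ORDER BY AsOfDate DESC );
```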

Should “bad things” happen, I feel it’s vital that my team be able to get to this information quickly; the script may grow larger than a typical SSMS grid cell will display, and that would be no fun in a crisis. So I have a second stored procedure that simply fetches the latest version of the script and prints it out:

CREATE PROCEDURE [dbo].[getDatabaseOwnerScript] AS
BEGIN
      DECLARE @ownerscript varchar(MAX);
 
      SELECT TOP 1 @ownerscript = DatabaseOwnerScript
      FROM dbo.DR_DatabaseOwnerScripts
      ORDER BY AsOfDate DESC;
 
      PRINT @ownerscript;
END
GO

Jobs

SQL Agent jobs are the next gotcha: if your system fails over and you don’t have those log backup jobs running, a full log file or disk is definitely in your future. There might be essential application logic in jobs, too, which would be lost. Agent jobs can be set up on the DR server in a disabled state, but it’s easy to miss some when there are tens or hundreds of instances to manage. So this logic automates storing a script version of the jobs, just to be on the safe side.

First another Admin/script table:

CREATE TABLE [dbo].[DR_JobScripts](
      [AsOfDate] [datetime] NOT NULL,
      [jobName] [varchar](255) NOT NULL,
      [jobScript] [varchar](max) NOT NULL
) ON [PRIMARY]
GO

And this time, a PowerShell script to populate the table, in order to take advantage of SMO’s built-in scripting:


param( $sqlServerName = $(throw 'Parameter $sqlServerName is required') )
 
# Name of the database where the output from this script will be stored.
# By default use a prefix plus the server name, replacing \ with _
# in case of a named instance:
$adminDBName = 'DBAdmin_' + ( $sqlServerName -replace '\\', '_' )
$jobsTable = 'dbo.DR_JobScripts'
$scriptTime = (get-date)
 
# Create an instance of SMO.Server connected to the target SQL instance
$srv = New-Object Microsoft.SqlServer.Management.SMO.Server $sqlServerName
 
# Create a data table in which to stage the results
$jobsDT = New-Object System.Data.DataTable
$jobsDT.Columns.Add( ( New-Object System.Data.DataColumn asOfDate, ([DateTime]) ) )
$jobsDT.Columns.Add( ( New-Object System.Data.DataColumn jobName, ([String]) ) )
$jobsDT.Columns.Add( ( New-Object System.Data.DataColumn jobScript, ([String]) ) )
 
# Script all Agent jobs to the data table
$srv.JobServer.Jobs | ForEach {
   $newRow = $jobsDT.NewRow()
   $newRow["AsOfDate"] = $scriptTime
   $newRow["jobName"] = $_.name
   $newRow["jobScript"] = [string]($_.Script())
   $jobsDT.Rows.Add( $newRow )
}
 
# Write the results out to the admin database
$destCon = "Data Source=$sqlServerName;Initial Catalog=$adminDBName;Integrated Security=True;"
$bulkCopier = New-Object Data.SqlClient.SqlBulkCopy $destCon
$bulkCopier.DestinationTableName = $jobsTable
$bulkCopier.WriteToServer( $jobsDT )

This little script will just connect to SQL Server, loop over all the jobs, script each one out to a string, then load the collection of scripts via a DataTable into the admin database jobs table. Once this is working, it gets the same treatment as the owners script: schedule in Agent, and add a step to purge out older entries from the table.

To output the job scripts with little effort, I have the same sort of stored procedure as for the owners, but this one has to iterate over the most recently recorded set of job scripts, since they live in separate rows in the table:

CREATE PROCEDURE [dbo].[getJobScript] AS
BEGIN
      DECLARE @jobscript varchar(MAX);
      DECLARE @lastSet datetime;
     
      SELECT @lastSet = MAX( asOfDate )  FROM dbo.DR_JobScripts;
     
      DECLARE curScripts CURSOR LOCAL FAST_FORWARD FOR
          SELECT jobScript
          FROM dbo.DR_JobScripts
          WHERE AsOfDate = @lastSet
          ORDER BY jobName asc;
 
      OPEN curScripts
     
      FETCH NEXT FROM curScripts INTO @jobscript
      WHILE @@FETCH_STATUS = 0
      BEGIN
         PRINT @jobscript;
         FETCH NEXT FROM curScripts INTO @jobscript 
      END
     
      CLOSE curScripts
      DEALLOCATE curScripts
     
END
GO

Ack! A cursor!

Alright, settle down, kids. It’s just a loop :-). Nothing to see here; I did say we’d look at some different methods! You see, the PRINT command has an overflow problem, which I’ll get to in a second, so storing each script separately, and then printing each separately, reduces the odds that we’ll hit PRINT’s limit. Yeah, “FOR XML” would probably be faster.
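If the cursor really offends, here’s a hedged sketch of the FOR XML alternative: it concatenates the latest batch of job scripts into a single string, though that combined string is then subject to the very PRINT truncation issue discussed in this post:

```sql
-- Sketch: concatenate the latest batch of job scripts with FOR XML PATH
-- (the combined string is still subject to PRINT truncation)
DECLARE @alljobs varchar(max);
SELECT @alljobs =
    ( SELECT jobScript + CHAR(13) AS [text()]
      FROM dbo.DR_JobScripts
      WHERE AsOfDate = ( SELECT MAX( AsOfDate ) FROM dbo.DR_JobScripts )
      ORDER BY jobName
      FOR XML PATH(''), TYPE ).value('.', 'varchar(max)');
PRINT @alljobs;
```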

Logins

This is maybe the nastiest one. If you are in the unfortunate position of having a server with individual SQL logins per user, then you probably have quite a list to manage, plus passwords, the potential for orphaned users, and so on. The best medicine I know of is the sp_help_revlogin procedure. I still meet people who don’t know this trick and spend time haggling with orphaned users. It’s super easy, and makes me look smart! I only wonder why it doesn’t ship with SQL Server in the first place.

Anyway, another simple PowerShell script can execute sp_help_revlogin and stash the results, as long as we can capture and process InfoMessage events (the events that cause output to appear in the SSMS Messages pane instead of the results pane). So, I install sp_help_revlogin from the KB article, then set up another little table:

CREATE TABLE [dbo].[DR_LoginScripts](
      [AsOfDate] [datetime] NOT NULL,
      [loginScript] [varchar](max) NOT NULL
) ON [PRIMARY]
GO

This PowerShell script will populate the table based on sp_help_revlogin:


param( $sqlServerName = $(throw 'Parameter $sqlServerName is required') )
 
# Name of the database where the output from this script will be stored.
# By default use a prefix plus the server name, replacing \ with _
# in case of a named instance:
 
$adminDBName = 'DBAdmin_' + ( $sqlServerName -replace '\\', '_' )
$loginsTable = 'dbo.DR_LoginScripts'
$scriptTime = (get-date)
 
# Execute and capture output from sp_help_revlogin
 
$outputScript = ""
 
$SqlCon = New-Object System.Data.SqlClient.SqlConnection
$SqlCon.ConnectionString = "Server=$sqlServerName;Database=master;Integrated Security=True"
 
$SqlCmd = New-Object System.Data.SqlClient.SqlCommand
$SqlCmd.Connection = $SqlCon
$SqlCmd.CommandText = "EXEC dbo.sp_help_revlogin"
 
$SqlCmd.Connection.Open()
 
# Capture infomessages from the stored proc execution subscribing to connection events
 
Register-ObjectEvent -inputObject $SqlCon -eventName "InfoMessage" -sourceIdentifier "SQL InfoMessages"
$SqlCmd.ExecuteNonQuery()
Unregister-Event -sourceIdentifier "SQL InfoMessages"
 
$SqlCon.Close()
 
Get-Event -SourceIdentifier "SQL InfoMessages" | ForEach {
   $outputScript += $_.SourceEventArgs.Message
   Remove-Event -EventIdentifier $_.EventIdentifier
}
 
# Create a data table in which to stage the results
$loginsDT = New-Object System.Data.DataTable
$loginsDT.Columns.Add( ( New-Object System.Data.DataColumn asOfDate, ([DateTime]) ) )
$loginsDT.Columns.Add( ( New-Object System.Data.DataColumn loginScript, ([String]) ) )
 
# Add one datatable row, with the login script
$newRow = $loginsDT.NewRow()
$newRow["AsOfDate"] = $scriptTime
$newRow["loginScript"] = $outputScript
$loginsDT.Rows.Add( $newRow )
 
# Write the results out to the admin database
$destCon = "Data Source=$sqlServerName;Initial Catalog=$adminDBName;Integrated Security=True;"
$bulkCopier = New-Object Data.SqlClient.SqlBulkCopy $destCon
$bulkCopier.DestinationTableName = $loginsTable
$bulkCopier.WriteToServer( $loginsDT )

Same drill as the previous example, but taking advantage of events to capture the output of PRINT from sp_help_revlogin.

And, yet again, we need a quick way to fetch the results. Here we have a bit of a problem: the script that results from sp_help_revlogin might be quite long for a large number of logins, and can easily overflow the simple output options in SSMS. PRINT will truncate a really long string, as the following examples show (each ends by outputting an exclamation point (!) so you can easily tell whether the output was truncated if you want to try them):

-- This long value will overflow a single print statement
-- and be truncated
DECLARE @longval varchar(MAX)
SET @longval = REPLICATE( 'abcdefg' + CHAR(13), 8000/8 ) + '!'
PRINT @longval
GO
 
-- This one just fits under the size limit
DECLARE @longval varchar(MAX)
SET @longval = REPLICATE( 'abcdefg' + CHAR(13), 7999/8 ) + '!'
PRINT @longval
GO
 
-- But chunking the output into separate PRINT calls
-- can allow more output
DECLARE @i int
SET @i = 1
WHILE @i < 10000
BEGIN
      PRINT 'More Text to Console in Separate Statements'
      SET @i += 1
END
PRINT '!'
 

So, if we’re storing long scripts, it can be helpful to have some output routine that will “chunk” the large script into SSMS-digestible bites (pun!). Here’s one such technique: this one just breaks the input string on line breaks, and spits out one line per PRINT statement:

CREATE PROCEDURE [dbo].[getLoginScript]
AS
    BEGIN
        DECLARE @loginscript varchar(max) ;
 
        SELECT TOP 1
                @loginscript = loginScript
        FROM    dbo.DR_LoginScripts
        ORDER BY AsOfDate DESC ;
 
        DECLARE @start int,
            @end int,
            @output varchar(8000) ;
       
        SET @start = 1 ;
        WHILE @start < LEN(@loginscript)
            BEGIN
                SET @end = CHARINDEX(CHAR(10), @loginscript, @start + 1) ;
                IF @end = 0
                    SET @end = LEN(@loginScript) + 1 ;
                SET @output = REPLACE(
                                REPLACE(
                                  SUBSTRING(@loginscript, @start,  @end - @start),
                                  CHAR(13), '')
                              , CHAR(10), '') ;
                PRINT @output ;
                SET @start = @end + 1 ;
            END
    END
GO

It’s also possible to use a common string-splitting function to cut the script up into smaller pieces.
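For example, a quick-and-dirty fixed-size variant (again a sketch, not a polished implementation) emits the stored script in 4000-character chunks. It’s simpler than splitting on line breaks, at the cost that a chunk boundary can fall mid-line:

```sql
-- Sketch: emit the latest login script in fixed 4000-character chunks
-- (simpler than splitting on line breaks, but chunks can break mid-line)
DECLARE @loginscript varchar(max);
DECLARE @pos int;

SELECT TOP 1 @loginscript = loginScript
FROM dbo.DR_LoginScripts
ORDER BY AsOfDate DESC;

SET @pos = 1;
WHILE @pos <= LEN( @loginscript )
BEGIN
    PRINT SUBSTRING( @loginscript, @pos, 4000 );
    SET @pos = @pos + 4000;
END
```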

Anyway, that’s where I am so far on this. If you’ve got observations, tips, techniques, I’d love to hear them. Cheers!

Published Tuesday, November 23, 2010 6:28 PM by merrillaldrich


Comments

 

WayneS said:

There is also a SSIS Task to transfer user logins, complete with transferring the SID just like the sp_help_revlogin procedure does.

November 23, 2010 11:40 PM
 

merrillaldrich said:

Hi WayneS - that can be a good solution too (there's also a SSIS task for jobs, IIRC). But last time I looked at it, it didn't bring passwords across for SQL logins. For me, that's kind of a deal-breaker. I really like the hashed passwords from sp_help_revlogin.

November 23, 2010 11:57 PM
 

AaronBertrand said:

Nice post Merrill.  This is exactly the set of things that I hope contained databases will eventually solve.  Denali will be a good start, but certainly nowhere near complete.

November 24, 2010 4:22 PM
 

Chris Howarth said:

Nice approach.

We perform a similar process (performed mostly in T-SQL) extended to include SQL Agent Operators, Alerts, Alert Notifications, Policies, sys.configurations values, custom error messages, server role membership etc...

To get around the SSMS output truncation issue we additionally BCP each server's scripts out to files on a central network share on a nightly basis. This enables quick and easy retrieval of the scripts should they be required for any reason.

November 24, 2010 4:56 PM
 

David Wimbush said:

Hi Merrill.

Thanks for this. I'll definitely be adding some of this to my DR setup. One drawback with sp_help_revlogin is that it doesn't script the login's default language, which could be bad. I added this ability and blogged about it here: http://sqlblogcasts.com/blogs/davidwimbush/archive/2010/04/09/copying-logins-to-another-server.aspx. I submitted it as feedback on the KB article but it hasn't had any effect. Maybe they're a bit busy on something else at the moment!

November 25, 2010 2:40 AM
 

AaronBertrand said:

David, probably a better mechanism than direct feedback is to use Connect (http://connect.microsoft.com/sql/).  Here is a recent example where there is active dialog (and potential action) about some issues in a KB article:

http://connect.microsoft.com/SQLServer/feedback/details/594910/issues-in-kb-article-2154845

Also the people who would correct a KB article are not the same people that are working on Denali, so no worries there.  :-)

November 25, 2010 10:12 AM
 

AaronBertrand said:

Oh, and FWIW, I would *not* expect any improvements to existing sp_ stored procedures.  They stay away from changing those interfaces as much as possible because they could break existing code.

November 25, 2010 11:44 AM
 

 

Robert L Davis said:

There is some great stuff you have here, but I would make one suggestion. There is a much better approach to handling the logins. I wrote a script that transfers logins, roles, and explicit permissions via T-SQL across a linked server connection. It doesn't use sp_help_revlogin. When it transfers a SQL login, it creates it with the same password (using the password hash) and the same SID.

You can find it here: http://www.sqlsoldier.com/wp/sqlserver/transferring-logins-to-a-database-mirror

November 27, 2010 1:08 PM
 

merrillaldrich said:

Thanks all - Robert, that's some great stuff, thanks for the link

November 27, 2010 1:47 PM
