
The Rambling DBA: Jonathan Kehayias

The random ramblings and rantings of a frazzled SQL Server DBA

Downgrading from SQL 2008 to 2005

Every couple of weeks, someone on one of the forums asks how to restore a backup from SQL 2008 to SQL 2005.  The answer is always the same: you can't restore a backup to a lower version of SQL Server.  If you need to migrate backwards, it is an entirely manual process.  I learned this unfortunate truth a few years ago after upgrading a development server to SQL Server 2005 ahead of the production server, so the developers of our eCommerce application could test against it.  At the time, a new database for another web project existed only in development; production was still on SQL 2000, and so was QA.  When I had to move that database to QA, what was originally planned as a simple backup/restore turned into a "What do I do now?" moment.

There are a number of ways to downgrade a database in SQL Server, but it generally boils down to recreating the database from scripts on the older version and importing the data back in.  A few years ago, I used SSIS to simplify this task with the Transfer Database Objects Task, which can move all of the objects in a source database to a target database, along with the data, in a single step.
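As a rough illustration of the scripting half of that approach, the SMO Scripter class can generate the object definitions with the target version set to the downlevel release; the data still has to be moved separately (BCP, SSIS, or the Transfer class shown later in this post).  This is only a minimal sketch, with placeholder server and database names that are not from this post:

using System;
using System.Collections.Specialized;
using Microsoft.SqlServer.Management.Smo;

namespace ScriptForDowngrade
{
    class Program
    {
        static void Main(string[] args)
        {
            // Placeholder server and database names
            Server src = new Server("SourceServerName");
            Database db = src.Databases["databasename"];

            Scripter scripter = new Scripter(src);
            scripter.Options.TargetServerVersion = SqlServerVersion.Version90; // script for SQL Server 2005
            scripter.Options.WithDependencies = true;   // emit objects in dependency order
            scripter.Options.Indexes = true;
            scripter.Options.DriAll = true;             // include keys, constraints, and defaults

            foreach (Table tbl in db.Tables)
            {
                if (tbl.IsSystemObject)
                    continue;

                // Script() returns the CREATE statements for the object as a StringCollection
                StringCollection script = scripter.Script(new SqlSmoObject[] { tbl });
                foreach (string statement in script)
                    Console.WriteLine(statement);
            }
        }
    }
}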

Last week, during what was expected to be a routine upgrade of a vendor application to the latest version, the upgrade failed, reporting that the version of SQL Server was unsupported by the application.  I fired up Profiler to see what the upgrade was doing, and it essentially boiled down to a version check.  A quick check showed the server had been on SQL 2008 for well over a year, so we called the vendor.  After a few hours of discussion (we would have been better off talking to a brick wall), the end result was that the application wasn't supported on SQL 2008, and they didn't care that it had been running there for over a year.

History repeats itself, or so they say, and once again I was faced with having to downgrade a SQL Server system.  Only this time I wasn't moving one database; I was going to have to downgrade just over eighty.  I opened up SSIS and soon realized I had no idea what I was doing, so I threw some questions out on Twitter.  My first problem was that I couldn't figure out how to loop over databases in a Foreach Loop Container.  Andy Leonard threw me a few links and then followed up with a complete SSIS solution to look at.  Problem solved, or so I thought.

I knew from past experience that you can use variables to configure tasks dynamically, so I figured that, armed with Andy's example, I would be able to quickly create an SSIS package that looped over the databases and used variables to set the source and target databases in the Transfer Database Objects Task.  That is what I thought initially, anyway.  After a few frustrating hours of playing around with it, I got an email back from Andy confirming that there was no known way to use variables in the Transfer Database Objects Task the way I was trying to.

(This is where this week's T-SQL Tuesday picks up.)

Since I couldn't configure a Foreach Loop Container to loop over each database and use variables to set the source and destination of a child Transfer Database Objects Task, I started manually creating individual Transfer Database Objects Tasks for each database.  About four databases into building the package, I thought to test it, and realized that due to design issues in the databases, such as schema objects not being fully qualified inside view definitions, I actually needed at least two transfer tasks per database.  At that point I decided there was no way in the world I was going to create that many SSIS objects.  Since SSIS works through SMO, I jumped out to PowerShell and began playing with a couple of things.  But being impatient, I got tired of constantly looking up properties and methods for the objects I was working with, so I jumped back into Visual Studio and went to C#, where IntelliSense made finishing the code very fast.

The end result was a short (<100 lines of code) C# console app that moved the databases from 2008 back to 2005:

using System;
using System.Collections.Generic;
using System.Linq;
using System.Text;
using Microsoft.SqlServer.Management.Smo;
using System.Diagnostics;
using System.Collections.Specialized;

namespace Transfer2008To2005
{
    class Program
    {
        static void Main(string[] args)
        {
            string sourceserver = "SourceServerName";
            string destinationserver = "DestinationServerName";

            Server src = new Server(sourceserver);
            Server dest = new Server(destinationserver);

            // Leftover list of specific database names; not used below
            string[] myarray = new string[] { "databasename", "otherdbname" };

            foreach (Database db in src.Databases)
            {
                if (db.IsSystemObject == false && db.Status == DatabaseStatus.Normal)
                {
                    Debug.WriteLine("Now moving " + db.Name);

                    // Drop the database on the destination if it already exists
                    if (dest.Databases.Contains(db.Name))
                        dest.KillDatabase(db.Name);

                    // First pass: create the target database and transfer the schema
                    // objects and data, scripted for SQL Server 2005 (Version90)
                    Transfer t = new Transfer(db);

                    t.DestinationServer = destinationserver;
                    t.DestinationDatabase = db.Name;
                    t.CreateTargetDatabase = true;
                    t.PreserveDbo = true;
                    t.TargetDatabaseFilePath = "D:\\SQLDATA\\";
                    t.TargetLogFilePath = "F:\\SQLLOGS\\";

                    t.CopyAllDefaults = true;
                    t.CopyAllFullTextCatalogs = true;
                    t.CopyAllRoles = true;
                    t.CopyAllRules = true;
                    t.CopyAllSchemas = true;
                    t.CopyAllStoredProcedures = true;
                    t.CopyAllSynonyms = true;
                    t.CopyAllTables = true;
                    t.CopyAllUserDefinedDataTypes = true;
                    t.CopyAllUserDefinedFunctions = true;
                    t.CopyAllUserDefinedTypes = true;
                    t.CopyAllUsers = true;
                    t.CopyData = true;
                    t.CopySchema = true;

                    t.CopyAllObjects = false;
                    t.DropDestinationObjectsFirst = true;

                    t.Options.WithDependencies = true;
                    t.Options.IncludeDatabaseRoleMemberships = true;
                    t.Options.Indexes = true;
                    t.Options.DriAll = true;
                    t.Options.Permissions = true;
                    t.Options.SchemaQualify = true;
                    t.Options.SchemaQualifyForeignKeysReferences = true;
                    t.Options.Statistics = true;
                    t.Options.TargetServerVersion = SqlServerVersion.Version90;
                    t.Options.IncludeIfNotExists = true;
                    t.Options.FullTextIndexes = true;
                    t.Options.ExtendedProperties = true;

                    t.TransferData();

                    // Second pass: reconnect as the schema owner to script the views
                    // and triggers, since the view definitions are not schema qualified
                    t = new Transfer(db);

                    t.DestinationServer = destinationserver;
                    t.DestinationDatabase = db.Name;
                    t.DestinationLogin = "schemalogin";
                    t.DestinationPassword = "schemapassword";
                    t.DestinationLoginSecure = false;

                    t.CopyAllViews = true;
                    t.CopyAllTables = true;
                    t.CopyAllObjects = false;
                    t.CopyAllDatabaseTriggers = true;
                    t.Options.Triggers = true;
                    t.CopyData = false;
                    t.CopySchema = true;
                    t.Options.IncludeIfNotExists = true;

                    t.TransferData();
                }
            }
        }
    }
}

Due to the schema issues, I had to double-step the migration for a majority of the databases, connecting as the schema owner to migrate the views back to SQL 2005.  This was still much easier than working with the SSIS GUI in Visual Studio.  However, I had to write a separate version of the code to move one database, based on the following observations.  Even with WithDependencies = true specified, the following must be considered:

  • Create the database and roles first.  The dependency walker doesn't seem to take into account that a role must exist before a user can be assigned to it, and the database principals are scripted in alphabetical order.
  • Foreign key constraints and DML triggers are problematic when manually downgrading a database.  If possible, create both of these object types after the data has been loaded into the tables, or disable them before loading the data (see the sketch after this list).  DML triggers will fire during the load and can cause primary key violations, especially when loading data into identity tables whose identity values match rows generated by the triggers firing.
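For the second point, one hedged way to disable the triggers and constraints around the data load, assuming the tables already exist on the destination, is to run the disable/enable statements through SMO.  The server and database names here are placeholders, and sp_MSforeachtable is an undocumented (but long-standing) system procedure:

using System;
using Microsoft.SqlServer.Management.Smo;

namespace DisableConstraintsForLoad
{
    class Program
    {
        static void Main(string[] args)
        {
            // Placeholder destination server and database names
            Server dest = new Server("DestinationServerName");
            Database db = dest.Databases["databasename"];

            // Disable DML triggers and foreign key / check constraints on every table
            // before the data load
            db.ExecuteNonQuery("EXEC sp_MSforeachtable 'ALTER TABLE ? DISABLE TRIGGER ALL'");
            db.ExecuteNonQuery("EXEC sp_MSforeachtable 'ALTER TABLE ? NOCHECK CONSTRAINT ALL'");

            // ... load the data here, e.g. a Transfer with CopyData = true ...

            // Re-enable triggers and re-check the constraints once the load completes
            db.ExecuteNonQuery("EXEC sp_MSforeachtable 'ALTER TABLE ? ENABLE TRIGGER ALL'");
            db.ExecuteNonQuery("EXEC sp_MSforeachtable 'ALTER TABLE ? WITH CHECK CHECK CONSTRAINT ALL'");
        }
    }
}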

Moving that database required a three-step process, but by setting the correct options anything is possible.
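As a rough illustration only, and with placeholder names, the three passes for that database might be structured like this with the SMO Transfer class: schema without foreign keys or triggers first, then data, then the constraints and triggers once the data is in place.  The exact option set used for the real database was not published, so treat this as a sketch:

using System;
using Microsoft.SqlServer.Management.Smo;

namespace ThreePassDowngrade
{
    class Program
    {
        static void Main(string[] args)
        {
            // Placeholder names - substitute the real servers and database
            string destinationserver = "DestinationServerName";
            Server src = new Server("SourceServerName");
            Database db = src.Databases["problemdbname"];

            // Pass 1: database, roles, users, schemas, and table definitions
            // (primary keys included, but no foreign keys or triggers yet)
            Transfer pass1 = new Transfer(db);
            pass1.DestinationServer = destinationserver;
            pass1.DestinationDatabase = db.Name;
            pass1.CreateTargetDatabase = true;
            pass1.CopyAllObjects = false;
            pass1.CopyAllRoles = true;
            pass1.CopyAllSchemas = true;
            pass1.CopyAllUsers = true;
            pass1.CopyAllTables = true;
            pass1.CopySchema = true;
            pass1.CopyData = false;
            pass1.Options.TargetServerVersion = SqlServerVersion.Version90;
            pass1.Options.DriPrimaryKey = true;
            pass1.Options.DriForeignKeys = false;
            pass1.Options.Triggers = false;
            pass1.TransferData();

            // Pass 2: data only, with no triggers or foreign keys in place to interfere
            Transfer pass2 = new Transfer(db);
            pass2.DestinationServer = destinationserver;
            pass2.DestinationDatabase = db.Name;
            pass2.CopyAllObjects = false;
            pass2.CopyAllTables = true;
            pass2.CopySchema = false;
            pass2.CopyData = true;
            pass2.TransferData();

            // Pass 3: foreign keys and DML triggers, now that the data is loaded
            Transfer pass3 = new Transfer(db);
            pass3.DestinationServer = destinationserver;
            pass3.DestinationDatabase = db.Name;
            pass3.CopyAllObjects = false;
            pass3.CopyAllTables = true;
            pass3.CopySchema = true;
            pass3.CopyData = false;
            pass3.Options.TargetServerVersion = SqlServerVersion.Version90;
            pass3.Options.DriForeignKeys = true;
            pass3.Options.Triggers = true;
            pass3.Options.IncludeIfNotExists = true;
            pass3.TransferData();
        }
    }
}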

Published Monday, January 11, 2010 11:51 PM by Jonathan Kehayias

Comments

 

The Rambling DBA: Jonathan Kehayias said:

This week I have two posts for TSQL2sday. Initially I wrote my first blog post, but as luck would

January 11, 2010 11:26 PM
 

Greg Linwood said:

Another option is to use Replication, which can automate all of the schema scripting + data export / import. If Primary Keys are in place, Transactional Replication can also be used to keep a downgrade path server in synch, providing a much faster fail-back option than SSIS or other techniques, which is sometimes important during version upgrades on HA systems. Despite all best efforts at testing, many businesses like to know they have a fast fail-back option in the event things go wrong (c:

January 12, 2010 2:31 AM
 

John Paul Cook said:

I've had great, easy success using Red Gate's tools to do this.

January 12, 2010 9:37 AM
 

Jonathan Gardner said:

There is nothing like learning things the hard way.

January 12, 2010 11:14 AM
 

Adam Machanic said:

According to T-SQL Tuesday rules as ratified by me in the first and second T-SQL Tuesday posts, the T-SQL

February 8, 2010 2:14 PM
 

Ian Miller said:

For John Paul Cook:  You mention you have had success using Red Gate's tools to downgrade a database - which tools and how?

February 8, 2011 1:36 PM
 

Jonathan Kehayias said:

Ian,

The tools from Redgate are SQL Compare and SQL Data Compare.  I've used them twice since writing this blog post myself and they do a very good job.

http://www.red-gate.com/products/sql-development/sql-compare/

http://www.red-gate.com/products/sql-development/sql-data-compare/

February 8, 2011 3:51 PM
