THE SQL Server Blog Spot on the Web


Allen White

Script and Migrate Agent Jobs between Servers using PowerShell

I'm nearing the end of a SQL Server upgrade project, migrating from SQL Server 2000 to SQL Server 2008 R2. The company uses SQL Server Agent to run about 150 jobs which execute DTS packages to do the work of moving data around. Now, there are any number of ways to move these jobs from one server to the other, but I wanted to be able to exclude certain jobs, make some minor changes before loading them onto the new server, that kind of thing. PowerShell makes that relatively easy to do.

First, most of the jobs were owned by individual logins, many of which belonged to people no longer with the company. That happens over time, but it should be cleaned up. To quickly change all the jobs to be owned by 'sa', it takes just a few lines of code:

# Connect to the instance using SMO (the SMO assemblies must already be
# loaded, e.g. by running inside sqlps)
$s = new-object ('Microsoft.SqlServer.Management.Smo.Server') $inst

# Iterate through the jobs to reset the job owner
foreach ($job in $s.JobServer.Jobs) {
	if ($job.OwnerLoginName -ne 'sa') {
		$job.OwnerLoginName = 'sa'
		$job.Alter()
		}
	}

My scripts usually take $inst, the SQL Server instance name, as a command-line argument, so I can use the script on any server. Once this script had run, all the jobs were owned by sa, and I didn't have to worry about enabling old logins on the new server. The next step is to script out the jobs into .sql files, one for each job. In this case I pass in a base directory path as a second command-line argument, then use PowerShell to create the Jobs directory under it. (Note that if you're using a named instance, I'd recommend using the encode-sqlname cmdlet to turn the $inst variable into an encoded name, to prevent the backslash from causing problems.)

# Create the Jobs directory if it doesn't exist
if (!(Test-Path -path "$base\$inst\Jobs\")) {
	New-Item "$base\$inst\Jobs\" -type directory | out-null
	}
$homedir = "$base\$inst\Jobs\"
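For a named instance, the backslash in $inst would otherwise create an extra directory level under $base. A minimal sketch of the encode-sqlname approach mentioned above (run inside sqlps, which provides the cmdlet):

# Encode characters such as the backslash in SERVER\INSTANCE so the
# instance name is safe to use as a single path component
$instpath = encode-sqlname $inst
$homedir = "$base\$instpath\Jobs\"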

The next step is to set the properties for the script files:

# Instantiate the Scripter object and set the base properties
$scrp = new-object ('Microsoft.SqlServer.Management.Smo.Scripter') ($s)
$scrp.Options.ScriptDrops = $False
$scrp.Options.WithDependencies = $False
$scrp.Options.IncludeHeaders = $True
$scrp.Options.AppendToFile = $False
$scrp.Options.ToFileOnly = $True
$scrp.Options.ClusteredIndexes = $True
$scrp.Options.DriAll = $True
$scrp.Options.Indexes = $True
$scrp.Options.Triggers = $True

Then we can cycle through the jobs and create scripts for each job on the server.

# Create the script file for each job
foreach ($job in $s.JobServer.Jobs) {
	$jobname = $job.Name
	$scrp.Options.FileName = "$homedir$jobname.sql"
	$scrp.Script($job)
	}

There are some jobs on the original server that won't move, such as the maintenance plan jobs, which I'm replacing with more efficient methods. Having individual script files lets me simply delete the files for the jobs that aren't moving to the new server. Once I've done that, I have another script which loads each script file into a PowerShell variable and, using invoke-sqlcmd, creates the job on the new server. Now, each script file ends with a 'GO' statement, and invoke-sqlcmd doesn't like that, so I want to strip it out, which is easy enough to do with the Replace() method. The problem is that the script also contains a GOTO command to bypass the error notification, and GOTO contains the string 'GO'. I work around that by first replacing GOTO with OTOG, then replacing GO with an empty string, then replacing OTOG back to GOTO.
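The three-step Replace() dance can be seen on a plain string (a hypothetical fragment of a job script):

# Hide GOTO, strip the batch separator, then restore GOTO
$qry = "IF (@retcode <> 0) GOTO QuitWithRollback`nGO"
$qry = $qry.Replace('GOTO','OTOG')   # 'OTOG' contains no 'GO'
$qry = $qry.Replace('GO','')         # removes only the batch separator
$qry = $qry.Replace('OTOG','GOTO')   # GOTO is back, trailing GO is gone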

The other 'tricky' thing I had to do was to convert the script file to a PowerShell string. I use the Get-Content cmdlet to pull the script into a PowerShell object. I then use the here-string construct with the PowerShell object inside the here-string, and that converts it from an object to a System.String object for me, and then I can easily supply that string as the query for invoke-sqlcmd.

# Get all the files with a .sql extension in the directory and cycle through them
$files = get-childitem $basedir | where {$_.Name -like '*.sql'}
foreach ($file in $files) {
	$fname = $file.Name
	$sql = get-content "$basedir$fname"
	# The here-string converts the object returned by get-content into a single string
	$qry = @"
$sql
"@
	# Hide GOTO, strip the GO batch separators, then restore GOTO
	$qry = $qry.Replace('GOTO','OTOG')
	$qry = $qry.Replace('GO','')
	$qry = $qry.Replace('OTOG','GOTO')
	invoke-sqlcmd -ServerInstance $inst -Database master -Query $qry
	}

Now the jobs are loaded on the new server, but of course the problem is that they're ACTIVE. Until we go live we don't want them running, so we need to disable them en masse. That's pretty simple as well.

$jobs = $s.JobServer.Jobs 
foreach ($job in $jobs) {$job.IsEnabled = $False; $job.Alter()}

And now the jobs are all in place, and ready to be enabled once the green light is given on the migration.


Published Wednesday, May 02, 2012 12:36 PM by AllenMWhite




Laerte Junior said:

Hi Allen, as always a nice PowerShell example. In my humble opinion, we can change the lines in the script to replace GO during the scripting process, using a regex. That way we don't need to go back and search the .sql files. Something like:

($scrp.Script($job) ) -replace "\bGO\b","" | Out-File "$($homedir)\$($jobname).sql"

Nice job Sir :)

May 2, 2012 12:44 PM

Andrew Neuman said:

You're my hero! I'm in the process of migrating and/or upgrading all of our servers to 2008R2 and was thinking about ways to migrate the agent jobs.

May 2, 2012 10:10 PM

Chad Miller said:

If you use the SMO method ExecuteNonQuery which is part of the Database class, you don't need to strip out the GO statements as ExecuteNonQuery understands GO:

#See the BatchSeparator property

#Using ExecuteNonQuery
$s = new-object ('Microsoft.SqlServer.Management.Smo.Server') $inst
$d = $s.databases["msdb"]
# $qry holds the job-creation script text, GO separators and all
$d.ExecuteNonQuery($qry)


May 6, 2012 3:51 PM

Allen White said:

This T-SQL Tuesday is about using PowerShell to do something with SQL Server. Now, if you've read any

February 12, 2013 1:42 PM

Marios Philippopoulos said:

Hi Allen, this article was very useful to me; I have used your code to migrate jobs from one server instance to another.

Regarding the last bit of code that applies the scripts to the target server, I think doing it like below avoids having to worry about GO statements:

$newProd = "XXXX\XXXXXXXX";

$homedir = "...\";

$files = get-childitem $homedir | where {$_.Name -like '*.sql'};

foreach ($file in $files)
{
  Invoke-Sqlcmd -ServerInstance $newProd -Database "UTILITY" -InputFile $homedir$file;
}


November 12, 2013 4:16 PM

iqbal said:

I want to learn how jobs are created, and how a particular script can be associated with a newly created job.

December 28, 2013 5:26 AM

AllenMWhite said:

Iqbal, here's an article I recently wrote that'll help you.

December 28, 2013 9:00 AM

Jordan Parker said:

Awesome article.

One thing I would add is that $job.IsEnabled can be set to $False right before scripting (so that we don't have to come back and set IsEnabled after running the scripts).  Since we're not altering the $job object on the origination server, it's a safe operation.
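As a sketch of that suggestion, assuming the $s and $scrp objects from the post, the disable can be folded into the scripting loop (IsEnabled is changed only in memory, since Alter() is never called against the source):

# Disable each job in memory before scripting it, so the generated
# script creates the job disabled on the target server
foreach ($job in $s.JobServer.Jobs) {
	$job.IsEnabled = $False          # no Alter(), so the source job is untouched
	$scrp.Options.FileName = "$homedir$($job.Name).sql"
	$scrp.Script($job)
	}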

July 15, 2014 1:27 PM

Paul said:

I found unless I cut/paste in SSMS (QE), Agent Job Tokens do not get updated correctly 2008 -> 2014 for $(ESCAPE...

June 4, 2015 3:38 PM

About AllenMWhite

Allen White is a consultant and mentor for Upsearch Technology Services in Northeast Ohio. He has worked as a Database Administrator, Architect and Developer for over 30 years, supporting both the Sybase and Microsoft SQL Server platforms over that period.
