THE SQL Server Blog Spot on the Web


Linchi Shea

Checking out SQL Server via empirical data points

Executing a large number of SQL scripts

Say, you are given a large number of scripts underneath the following directory tree:
You need to execute these scripts on a SQL Server instance. To simplify the matter a bit, let’s assume that these scripts can be executed in any order. How would you go about it?


Here’s how I would do it with the following trivial throw-away Perl script:


# script file name:

$path = 'N:\Apps\Release\PRS\PR12345';
$server = 'SQLNYC01';

@scripts = `dir /B /s $path`;

foreach $f (@scripts) {
   if ($f =~ /^(.+\.sql)\s*$/i) {
        $osql = "sqlcmd.exe -S$server -E -w250 -i" . $1;  # $1 is the captured path without the trailing newline
        print "$osql\n";
#        $log = `$osql`;  # uncomment to execute each script directly
   }
}

I then run this little script and re-direct the result to a file:

cmd>perl > PR12345.bat


PR12345.bat is the batch file that you can run to execute all the SQL scripts. Note that this Perl script assumes that all the SQL script files have the .sql file extension. That is just an example to show that you can easily filter out or filter for specific types of files, and construct the batch as you wish.
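For illustration, the same filter-and-build step can be sketched in Python (a hypothetical port of the Perl above; the path, server name, and sqlcmd flags are taken from the post):

```python
# Hypothetical Python sketch of the filter-and-build step.
path = r'N:\Apps\Release\PRS\PR12345'
server = 'SQLNYC01'

def build_batch(file_names):
    """Keep only .sql files and turn each into one sqlcmd invocation."""
    return [f'sqlcmd.exe -S{server} -E -w250 -i{f}'
            for f in file_names
            if f.lower().endswith('.sql')]

# Walking the tree (the equivalent of `dir /B /s`) would be:
# import os
# files = [os.path.join(d, f) for d, _, fs in os.walk(path) for f in fs]
```

Writing the returned lines to a .bat file gives the same result as redirecting the Perl script's output.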


You can also easily execute the SQL script files directly in the Perl script without using an intermediate batch file. The commented line in the script does precisely that (if uncommented).


The assumption that the SQL scripts can be executed in no particular order can be easily removed with just a little more coding, as long as a way to identify the order is provided. For instance, a list of the script file names may be included in some release notes, and the scripts must be executed in that order. It is quite easy to take that list and sort the SQL script files, along with their complete paths, so that they are listed in the correct order in the batch file.
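That sorting step can be sketched as follows (a hypothetical Python helper; the function name and arguments are illustrative, not from the post):

```python
import ntpath  # ntpath parses Windows-style paths on any platform

def order_by_release_notes(full_paths, ordered_names):
    """Sort full script paths to follow the order given in the release notes.

    ordered_names is the list of bare script file names in required
    execution order; files not mentioned in the notes sort last,
    keeping their original relative order (sorted() is stable).
    """
    rank = {name.lower(): i for i, name in enumerate(ordered_names)}
    unlisted = len(rank)
    return sorted(full_paths,
                  key=lambda p: rank.get(ntpath.basename(p).lower(), unlisted))
```

Feeding the sorted list into the batch-building step then produces a batch file that honors the release-notes order.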


Published Thursday, December 4, 2008 9:15 PM by Linchi Shea



James Luetkehoelter said:

Perl! I love perl. I hate the batch file syntax - I'd always resort to perl. There was a day when any SQL Server I managed immediately got ActivePerl installed on it :)

Incidentally, you can also do some creative things like this with SSIS with the For and For Each containers. SSIS isn't just about moving data around...

Nice reminder Linchi that we don't necessarily need a complex or static solution to what seems on the surface to be a complicated or time-consuming task.

Nice meeting you BTW!

December 4, 2008 8:27 PM

Linchi Shea said:

Would love to see some responses from PowerShell junkies.

December 4, 2008 8:46 PM

DonRWatters said:

You set me up...  It's even more trivial than the perl example.  This is not through SQLPS, but through the normal powershell, using the snapins from the SQL Server 2008 install (we need invoke-sqlcmd).

sl 'N:\Apps\Release\PRS\PR12345'

gci -include *.sql -recurse | % {invoke-sqlcmd -server "SQLNYC01" -inputfile $_.fullname}

Save that as a ps1 file...done.

December 4, 2008 10:10 PM

Linchi Shea said:

Excellent, Don! What if we only need to look for SQL script files whose names start with 'p_'?

In Perl, here's the only change I need to make:

  if ($f =~ /^(.+\\p\_[^\\]+\.sql)\s*$/i) {

Personally, I'm not a fan of PowerShell's reliance on cmdlets. I'm a fan of versatility. But you can prove me wrong, Don!

December 4, 2008 10:35 PM

Linchi Shea said:

Hey, Don;

You started it, and I felt compelled to respond :-)

Here's a one-liner in Perl to go against your two-line PS:

print map {"sqlcmd.exe -SNYCSQL01 -E -w250 -i$_"} grep(/^(.+\\p\_[^\\]+\.sql)\s*$/i, `dir /B /s N:\\Apps\\Release\\PRS\\PR12345`);

December 4, 2008 10:54 PM

DonRWatters said:

Ok...first, I didn't HAVE to use invoke-sqlcmd, I could've just as easily used the executable, just as you have.  I don't really think it's a problem to use cmdlets.  You like to use PERL modules that were written by others, I assume.  Same thing, in my humble opinion.  Second, I don't believe that it's really a contest...but if it were, you're going to win, no matter what.  It's your blog. :)

December 4, 2008 11:40 PM

Linchi Shea said:

True, cmdlets and modules, what's the difference? Well, the difference is that in Perl I don't have to rely on modules to get a lot of work done. I don't mean to promote Perl at the expense of PowerShell. After all, PowerShell is more a shell than a scripting language. They are not in the same playing field. In the end, they actually complement each other very well for some tasks. For instance, a common scenario will probably be to wrap a Perl script around a PowerShell scriptlet and rely on the PowerShell scriptlet to get some SQL Server info, and then do the text processing in Perl.
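That wrapping pattern can be sketched roughly as follows (hypothetical, and shown in Python rather than Perl purely for brevity; it only assembles the command line, and the actual invocation is left commented out since it assumes PowerShell is on the PATH):

```python
def powershell_command(scriptlet):
    """Build the argument list a wrapper script would use to run a
    PowerShell scriptlet and capture its text output for post-processing."""
    return ['powershell.exe', '-NoProfile', '-Command', scriptlet]

cmd = powershell_command(
    "gci N:\\Apps\\Release\\PRS\\PR12345 -filter '*.sql' -recurse "
    "| % { $_.FullName }")

# The wrapper would then run it and do the text processing on the lines:
# import subprocess
# lines = subprocess.run(cmd, capture_output=True,
#                        text=True).stdout.splitlines()
```

The PowerShell scriptlet supplies the system-specific shortcut; the wrapper language does the text processing, which is the division of labor described above.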

And then, I don't really know PowerShell that well. So I truly wanted to see how far it could go in dealing with certain problems. That was the motivation.

Come on, Don!

December 5, 2008 12:55 AM

James Luetkehoelter said:

Oh, you guys are quibbling over details :) The overall message here is - don't build a yacht when a raft works just fine :)

December 5, 2008 1:01 AM

Scott R. said:

Linchi and Don,

I have “kicked the tires” a bit with PowerShell and can read Perl to a limited degree, but I am no expert in either.  However, I have observed a similarity between PowerShell and SQL – they are both set-based languages / environments that can also process data using explicit iteration (but maybe not as efficiently in all cases) – recordsets for SQL and .NET object collections for PowerShell.

Solving a problem in set-based form (implicit iteration, without using explicit iteration – loops, etc.) often results in a more efficient (run time and computer resource perspective – memory, CPU, etc.) and more elegant (compact – fewer lines of code to develop, debug, maintain, etc.) solution.  Explicitly iterative solutions are also possible in the respective scripting languages, but may not be as efficient.  Efficiencies are more likely to come when the iteration is pushed to the lowest levels by use of higher-level set-based operators.  There are exceptions to this set-based efficiency advantage, as many others have noted in past discussions on “the evils of cursors” in T-SQL.  It also assumes a mature, well-optimized set of higher-level set-based operators.

Perl has been around longer, but PowerShell seems to have some compelling uses.  A solution that can be done in a few lines by piping results between basic cmdlets and little or no other “scripting” (declared variables, explicit iteration, etc.) can be a thing to behold, once you get to know the cmdlets and their options.  However, examples I have seen of heavier PowerShell script usage can be cumbersome to read and understand.  Perhaps this is one of the issues that Linchi is exploring in this comparison.  My lack of deeper Perl knowledge puts me at the same disadvantage with some Perl examples.  I believe proficiency in either environment takes time, effort, much practice, and a good bit of trial and error in the learning process.

I still use a lot of VBScript and BAT scripts in work situations, because I am more familiar and proficient with them (recalling how much effort it took me to get from zero to here), and their run-time environments are pre-installed on any system I use.  Some companies have “policies” against using “other” scripting environments (for “security” reasons, staff training and knowledge, etc.).  I’m not saying those reasons are right, but they may exist and have to be addressed.

Personally, I hope to learn more about PowerShell and possibly Perl also, to have more options available in my tool box for a given situation.  While it might be nice to settle on one “best” language / environment, I believe that knowledge of a few useful tools in this area (and knowing when to use which one) can be an advantage.

It is a paradigm shift to think of solutions in set-based form (versus explicit iteration), much as it was a paradigm shift for event-based programming when VB first came out.  Many developers still struggle with the need to use explicit loops to process data within T-SQL, when set-based solutions may be available and more effective.  The developer thinking process is largely “item / row at a time” for most other programming aspects, and continues to use that same approach in areas where better solutions may exist.  I don’t always succeed here myself, but I continue to strive for set-based solutions in the work I do where possible, learning from past solutions and better examples along the way.  I recall similar struggles when learning APL in a “survey of programming languages” college class – all set-based operators with special characters for each operator – compact, “elegant”, but very cryptic.

To put it in Joe Celko terms: “Thinking in Sets”

Your thoughts and observations?

Let the debate continue!

Scott R.

December 5, 2008 7:40 AM

Linchi Shea said:


I think you've drunk too much SQL Kool-Aid to see set-based solutions as the hammer for the nails in the world beyond the relational databases :-)

December 5, 2008 9:08 AM

jerryhung said:

Mnn, I must suck, using 3rd-party tools like Red Gate SQL Multi Script or xSQL Script Executor (add a folder)

I have a .bat file using a FOR loop, but that doesn't work with UNC paths

I have VBScript too..

Getting the hang of PowerShell every day

December 5, 2008 11:05 AM

Steve Fibich said:

Gotta go with Powershell

gci N:\Apps\Release\PRS\PR12345 -filter '*.sql' -recurse |% {sqlcmd -S SQLNYC01 -E -e -i $_.FullName}

Not just one physical line of code, but one logical code statement.  Not all of us have the SQL 2008 PowerShell plugins installed.

December 5, 2008 12:53 PM

Scott R. said:


Not trying to change your opinion or drink any Kool-Aid! :-)

Sets of things to process are not limited to RDBMS analogies – tables, rows, etc.  Common hierarchies of collections / sets from non-RDBMS data sources are exposed through a growing set of PowerShell data providers (file systems, registry, WMI, Exchange, etc.) as well as SQL Server metadata and configuration info.

Reminds me of the Microsoft LogParser tool – initially awkward to learn how to use, but very productive once you get the hang of it.

Like most tools in any toolbox, one size rarely fits all.  Find the hammer(s) and nails that work for you.  I will continue to use VBScript and BAT scripts, and learn more about PowerShell and Perl to use as appropriate.

Best wishes on your comparisons!

Scott R.

December 6, 2008 12:58 PM

Linchi Shea said:


I was kidding on the SQL Kool-Aid thing. But I do contend that set-based solutions are more an exception than a norm in the real world in general. Set-based solutions are great in SQL because the SQL optimizer (which is nothing but a set-to-procedure translator) has gotten a lot better. But not every real-world problem has a well-built optimizer to translate a 'set-based' solution into a procedural one.

December 7, 2008 7:43 PM

Rob Farley said:

With the 'p' bit...

gci N:\Apps\Release\PRS\PR12345 -filter 'p_*.sql' -recurse |% {Invoke-Sqlcmd -ServerInstance SQLNYC01 -InputFile $_.FullName}

And I'll raise you a "files that have appeared since yesterday"

Get-ChildItem N:\Apps\Release\PRS\PR12345 -filter 'p_*.sql' -rec |

Where-Object {$_.CreationTime -gt (Get-Date).AddDays(-1)} |

ForEach-Object {Invoke-Sqlcmd -ServerInstance SQLNYC01 -InputFile $_.FullName}

(And I've removed the use of aliases, just to be clearer...)


December 8, 2008 10:16 PM

Linchi Shea said:


Here's the Perl:

use File::Find;

find sub {unshift @abc, $File::Find::name if -M $File::Find::name <= 1;}, 'N:\Apps\Release\PRS\PR12345';

print map {"sqlcmd.exe -SNYCSQL01 -E -w250 -i$_"} grep(/^(.+\\p\_[^\\]+\.sql)\s*$/i, @file);

December 9, 2008 11:35 AM

Linchi Shea said:

Oops! that @file should be @abc from the previous line.

But don't get me wrong, Rob. Since they've put so much effort into all these cmdlets, they've provided a lot of short-cuts into individual systems (SQL Server, etc.) that are not available elsewhere, and I'll use these short-cuts by wrapping Perl around PowerShell. Perl is great at stitching disparate systems together, and PowerShell is good at providing the system-specific short-cuts. That could be a nice marriage.

December 9, 2008 11:47 AM

JJEugene said:

I have long used a solution that combines MS Access and Word.  I use MS Access to store a list of file names (in order and by categories).  It also includes features like description columns and ability to change what folder to find the code in.  Then there is a button on the Access form which uses automation to build a Master Document in Word.  

Of course, there are cons to this approach, but I love it (and have been using this approach without problem and with lots of joy for many years) because I get two things at once: 1) the ability to work with all my code in separate files, 2) a fast way not only to be able to compile all those files for running them when I need to re-create a database, but also a single place to do global search and replace across all my code (in the master document - which points back to the original files).  Want to change a column name? No problem, you will catch every instance of that column name used in code and be able to update in one step.  

Another reason I like this approach is that the Access .mdb gives me a very clear tree-view of my code.  I think it is good documentation for me and any future person who replaces me.

Just sharing.

December 11, 2008 12:43 PM