
Andy Leonard

Andy Leonard is a Data Philosopher at Andy Leonard Consulting, an SSIS Trainer, Consultant, and developer; a Business Intelligence Markup Language (Biml) developer and BimlHero; SQL Server database and data warehouse developer, community mentor, engineer, and farmer. He is a co-author of SQL Server Integration Services Design Patterns, and author of Managing Geeks - A Journey of Leading by Doing, and the Stairway to Integration Services.

  • VSTA Script Editor Not Displaying in VS2015 / SSIS 2016


    I probably did something dumb to cause this, but on the off chance it’s not (just) me I thought I would blog about it.

    I’ve been tinkering with SSIS 2016 in SQL Server 2016 CTP3. I set up a VM running Windows Server 2012 and installed the CTP along with SQL Server Data Tools for Visual Studio 2015 (October release).

    After installing and configuring, I built an SSIS Catalog and poked around for changes (more on that in another post… or series of posts…). Then I began building SSIS packages to test stuff (see previous parenthetical comment). Except… I couldn’t get the VSTA code editor to open for my Script Tasks. It didn’t matter which language I chose, clicking the Edit Script button produced no code editor.


    Someone’s probably found this and fixed it by now. But if they haven’t, and if you run into this issue, here’s how I resolved it:

    1. I loaded the CTP3 iso file into the VM’s optical drive:


    2. I browsed to the redist\VSTA\14.0 folder on the installation media and executed the vsta_setup.exe file:


    3. When prompted, I selected the Repair button:


    The application executed for about a minute and VOILA! My VSTA editors started working:


    w00t! Now, back to more testing…


  • #sqlcares

    After a couple long naps Saturday and Sunday, I’m almost caught up on sleep from last week. Ok, not really. But we were out really late most nights last week. As I told some people, “It has been dark for a couple hours already!”

    The PASS Summit is always a time for #SQLFamily to come together. Tom LaRock (Blog | @SqlRockStar) mentioned this in his address to open the event Wednesday morning, stating that we are, in fact, a family. The Summit is a lot like an annual family reunion.

    One of our members had a particularly challenging week. Mike Walsh (Professional Blog | Personal Blog | @Mike_Walsh) was diagnosed with Multiple Sclerosis the week before the PASS Summit 2015 started. Buck Woody (Blog | @BuckWoodyMSFT) posted a great idea on his blog, titled #sqlcares. I encourage everyone to go read Buck’s post and help out in any way you can.

    I encourage my fellow SQL Server Community bloggers to join Buck and help spread the word. Please use the #sqlcares hashtag Buck started.


  • #Biml at the PASS Summit 2015

    I’m excited that PASS included Biml in the Summit 2015 this year. It’s the first year for this topic, at least officially. Unofficially, Biml topics have been presented the past couple years. The technology is maturing and receiving the exposure it so richly deserves.

    I’m honored to be presenting Use Biml to Automate SSIS Design Patterns [BIA-401] Thursday 29 Oct at 3:15 PM PT in Room 3AB:

    Perhaps you’ve heard the buzz about Business Intelligence Markup Language (Biml) and wondered, “Why is everyone so excited about Biml?” Occasionally, a new technology emerges that changes everything. For SSIS developers, Biml is one such technology. Business Intelligence Markup Language (Biml) is a powerful solution for rapidly creating SSIS packages and solutions. SSIS Design Patterns support various load profiles. Combined, Biml and SSIS Design Patterns offer a compelling solution for automating enterprise data integration.

    In this session, you will learn:
    -    How to rapidly build multiple SSIS packages using Biml
    -    How to apply different SSIS Design Patterns using Biml
    -    How to automate SSIS Design Patterns using Biml

    I feel like I’ve been working on the demos for this session for five years, and I’m stoked to share what I’ve learned with you! I hope to see you at the PASS Summit 2015!


  • Browsing the Catalog with SSIS Catalog Compare

    One neat feature of SSIS Catalog Compare is the Catalog Browser. It’s a TreeView control that connects to the SSIS Catalog and displays handy information for Data Integration Administrators and Release Configurations people working in a DevOps enterprise.

    The Integration Services Catalogs node in SQL Server Management Studio’s (SSMS) Object Explorer provides this view of an SSIS Project in the Catalog:

    Viewing the SSISDB\FrameworkTestFolder\ProtectionLevelTests SSIS Project in SSMS

    There’s nothing wrong with this view. But there’s a lot more to the configuration of the SSISDB\FrameworkTestFolder\ProtectionLevelTests SSIS Project than what’s visible at a glance in the SSMS Object Explorer’s Integration Services Catalogs node.

    Here’s the same SSIS Project (in the same SSIS Catalog) in SSIS Catalog Compare (please click the image to view enlarged in a separate window):

    Viewing the SSISDB\FrameworkTestFolder\ProtectionLevelTests SSIS Project in SSIS Catalog Compare

    SSIS Catalog Compare provides more information about the SSISDB\FrameworkTestFolder\ProtectionLevelTests SSIS Project. Everything displayed by the Integration Services Catalogs node in SSMS is visible. In addition, you can view:

    • Project Parameters
      • including Connection Manager metadata, Environment Variable mappings, and run-time overrides
    • Project References to SSIS Catalog Environments
      • including Environment Variable mappings and run-time overrides
    • Package Parameters
      • including Environment Variable mappings and run-time overrides
    • In the Environments node, the Environment Variables and their values are listed.
      • Sensitive values are masked.

    Visibility into these configuration settings – shown in the context of the SSIS Catalog – provides a rich experience for individuals who manage or monitor the Enterprise Data Integration Lifecycle.
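
    The same metadata can also be pulled directly from the SSISDB catalog views. Here is a minimal sketch that lists the parameters for the project shown above (environment references and Environment Variables live in catalog.environment_references and catalog.environment_variables, respectively):

    Select f.name As folder_name,
           pr.name As project_name,
           op.object_name,
           op.parameter_name,
           op.design_default_value,
           op.value_type   -- 'V' = literal value, 'R' = referenced from an Environment Variable
    From SSISDB.catalog.object_parameters As op
    Join SSISDB.catalog.projects As pr On pr.project_id = op.project_id
    Join SSISDB.catalog.folders As f On f.folder_id = pr.folder_id
    Where f.name = 'FrameworkTestFolder'
      And pr.name = 'ProtectionLevelTests'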

    When you compare the projects, the differences between SSIS Catalog instances are highlighted as shown here (please click the image to view enlarged in a separate window):

    Comparing the SSISDB\FrameworkTestFolder\ProtectionLevelTests SSIS Project with the Same Project in Another SSIS Catalog Instance

    Project Parameter and Package Parameter highlighting indicates one missing reference mapping (the Debug Package Parameter) and differences in the Project Connection Manager reference mappings (the ConnectionString and Password connection manager properties are not mapped for the SSIS Project in the second SSIS Catalog).

    Note the Catalog Environment named ProtectionLevelEnv exists in both Catalog instances, but it contains no Catalog Environment Variables in the vmSQL14\LT Catalog (shown on the right). If the vmSQL14 Catalog (shown on the left) represented a Test or Development instance and the vmSQL14\LT Catalog represented a Production instance, the SSISDB\FrameworkTestFolder\ProtectionLevelTests SSIS Project could appear – at a glance – to be configured properly in Production when, in fact, it is not.

    Using image-editing software, I created a similar-looking side-by-side comparison of the SSISDB\FrameworkTestFolder\ProtectionLevelTests SSIS Project in the two Catalog instances shown above in SSIS Catalog Compare. The SSIS Catalogs show some of the differences between the SSIS Projects, but not all:

    Side-by-Side Comparison of Two Instances of the SSISDB\FrameworkTestFolder\ProtectionLevelTests SSIS Project

    SSIS Catalog Compare improves enterprise data integration management by facilitating a fast and accurate comparison of SSIS Projects in different SSIS Catalog instances.

    Want to Learn More?

    Join me and RedGate Software for the SSIS Catalog Compare webinar 3 Nov 2015 at noon ET.

    Register today!

    Buy the Beta Version 

    For a limited time, the beta version is available for only $95 USD. If you purchase this version of SSIS Catalog Compare, you will receive a non-expiring, perpetual license for the lifetime of the product. That means free upgrades for as long as SSIS Catalog Compare exists!

    More details here... 


  • SSIS Catalog Compare

    It’s time to deploy that SSIS project from Development to Test, or from User Acceptance Testing to Production. Maybe it’s the very first time you’re deploying this project, maybe it’s the deployment of an upgrade or bug fix. Maybe you’re part of a team of data integration developers and you’re not sure if another team member has already deployed the upgrade.

    What then?

    You open SQL Server Management Studio (SSMS), expand both Integration Services Catalogs nodes, and visually inspect the deployments, noting differences. You ask yourself:

    • Which packages are deployed to UAT but missing from Production?
    • Which packages are deployed to Production but missing from UAT?
    • Are there differences between Catalog Environments in UAT and Production?
    • Are there differences between Catalog Environment Variables in UAT and Production?

    There are lots of ways to answer these questions, but they all involve more work than they should.
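
    One of those ways is to run a query like the following against SSISDB on each instance and compare the result sets by hand – a minimal sketch using the built-in catalog views:

    Select f.name As folder_name,
           pr.name As project_name,
           pkg.name As package_name
    From SSISDB.catalog.packages As pkg
    Join SSISDB.catalog.projects As pr On pr.project_id = pkg.project_id
    Join SSISDB.catalog.folders As f On f.folder_id = pr.folder_id
    Order By f.name, pr.name, pkg.name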

    Until now.

    SSIS Catalog Compare


    Developed by Kevin Hazzard (blog | @KevinHazzard | DevJourney) and Andy Leonard, SSIS Catalog Compare allows developers, architects, and release managers to visualize what’s present and missing from SSIS Catalogs. This is helpful at the time of deployment (or audit), but the SSIS Catalog Compare utility does more (much more!) than this.

    Intrigued? Want to learn more? Join me and RedGate Software for the SSIS Catalog Compare webinar 3 Nov 2015 at noon ET.

    Register today!

    Buy the Beta Version 

    For a limited time, the beta version is available for only $95 USD. If you purchase this version of SSIS Catalog Compare, you will receive a non-expiring, perpetual license for the lifetime of the product. That means free upgrades for as long as SSIS Catalog Compare exists!

    More details here... 


  • Relaunch!

    The website has been redesigned and relaunched by the good people at Varigence!

    The new site includes learning paths and tests. It’s almost as awesome as Biml.

    Go check it out today!


    Bonus – check out the new BimlScript commercial!

  • Seeing Sharply

    My technology career began as a curiosity in 1975. I was 11 years old and living next door to an elderly couple. Their son returned from an engineering career in the Air Force and began tinkering with a microprocessor kit. I was curious and he taught me Motorola 6800 machine language. After that, 40 years ago at the time of this writing, he taught me BASIC.

    I stuck with BASIC through GWBasic and other versions. When Visual Basic (VB) was released, I dove into the language, relishing the ability to develop Windows programs in a familiar syntax. About the time .Net came out I made the leap to database technology. Lots of my VB friends made the leap to C#. I tried – even writing the demos for my second book project (Professional Software Testing with Visual Studio 2005 Team System) in C#. But for some reason, my brain refused to grasp C#.

    And then I began working with Kevin Hazzard.

    Kevin is inspirational. And cool. And funny. And intelligent. And cool. And mostly patient, which benefited me greatly. When he found himself dropped into the middle of an SSIS project and needing a little guidance, I took the opportunity to swap him C# tutoring for SSIS tutoring. It worked out well, and I have been developing with C# for several months now. I have a long way to go before I will venture to call myself a C# developer (and it may never happen). But I am comfortable with the language, and that’s something I could not say (or write) a year ago.

    Thanks, Kevin, for helping me take another step on my DevJourney.


  • SQL Server 2016 Temporal Tables and Type II Data Warehouse Dimensions

    SQL Server 2016 introduces temporal tables, an interesting and “data warehouse-y” addition to the relational database.

    So how do temporal tables work? They add a couple columns to a table that are used to track the date a row is inserted or last updated, based on the primary key. Suppose you have a row that looks like this:


    If you begin tracking this data temporally, the row could appear similar to this:


    Note the dateTime2 data type used for the date-range fields. A ToDate value of 9999-12-31 23:59:59.9 indicates this row is “current.” The value in the FromDate field indicates when the row was inserted or last updated, whichever happened last.
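
    In Type II dimension terms, the “current” rows are the ones whose ToDate holds that sentinel value. A minimal sketch (it assumes the dbo.TestTable built later in this post, with DateTime2(1) period columns):

    Select *
    From dbo.TestTable
    Where ToDate = '9999-12-31 23:59:59.9'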

    How did we get here? Let’s build out a demo. But first, a shout-out to Damian Widera, whose SQLBlog post titled “Temporal table behind the scenes - part 1” raised my interest in temporal tables.


    You can begin by creating a database. I created TemporalTests using the following statement:

    If Not Exists(Select name
                  From sys.databases
                  Where name = 'TemporalTests')
      Create Database TemporalTests

    Next, I created a table named TestTable in TemporalTests:

    Use TemporalTests

    If Not Exists(Select name
                  From sys.tables
                  Where name = 'TestTable')
      Create Table dbo.TestTable
      (  Id Int Identity(1,1)
           Constraint PK_TestTable Primary Key,
         Name VarChar(20),
         Value Int,
         FromDate DateTime2(1) Generated Always As Row Start Not NULL,
         ToDate DateTime2(1) Generated Always As Row End Not NULL,
         Period For System_Time (FromDate, ToDate)
      ) With (System_Versioning = ON)

    I won’t go into all the details of the syntax, but I will point out three requirements:

    1. The table needs a primary key. This serves to identify the row and, I believe, should not be updated. This makes the primary key function a lot like a surrogate key in a data warehouse dimension.
    2. The table needs two DateTime2 fields configured similar to those shown (FromDate and ToDate) and a clause identifying these fields as the temporal value columns - Period For System_Time (FromDate, ToDate).
    3. The table needs the key phrase With (System_Versioning = ON) at the end of the Create Table statement (when creating a temporal table).

    When created, the table in Object Explorer appears as shown:


    Note you can supply a name for the temporal system table, but if you don’t supply a name SQL Server will create one for you.
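
    If you would rather name the history table yourself, the HISTORY_TABLE option handles that. A sketch of the variation (the table names here are my own, not part of the demo above):

    Create Table dbo.TestTableNamed
    (    Id Int Identity(1,1)
           Constraint PK_TestTableNamed Primary Key,
         Name VarChar(20),
         Value Int,
         FromDate DateTime2(1) Generated Always As Row Start Not NULL,
         ToDate DateTime2(1) Generated Always As Row End Not NULL,
         Period For System_Time (FromDate, ToDate)
    ) With (System_Versioning = ON (History_Table = dbo.TestTableNamedHistory))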

    Let’s load some data!

    Insert Into dbo.TestTable
       (Name, Value)
    Values
       ('A', 0)
     , ('B', 1)
     , ('C', 2)
     , ('D', 3)
     , ('E', 4)

    If we query the table, we’ll see all the rows have the same FromDate and ToDate:


    Now, if we update a few rows:

    Update dbo.TestTable
    Set Value = Value * 10
    Where Value > 2

    The table now has some rows – the last two – with different FromDate values:


    We can now exercise a Type-II-like dimension query to examine the state of the rows at some time before the latest update:

    Select *
    From dbo.TestTable
    For SYSTEM_TIME As Of '2015-09-18 22:13:00.0'

    The results? The values as they were at the time specified in the For SYSTEM_TIME As Of '2015-09-18 22:13:00.0' clause:


    Cool? I thought so too.

    I still have unanswered questions about temporal tables for Type-II dimension tracking. The largest question regards performance at scale. We’ll see once we get out of the SQL Server 2016 CTPs…


  • Presenting at SQL Saturday 431–Spartanburg 26 Sep!

    I am honored to present two sessions at SQL Saturday #431 in Spartanburg, SC on 26 Sep 2015! I will be delivering:

    SQL Saturday #431 also features two stellar precons: Transforming and Visualizing Data with Power BI by Patrick LeBlanc and Getting the New DBA Up to Speed with Performance Tuning by Ed Watson and Mike Lawell.

    There are lots of awesome speakers at SQL Saturday #431 – check out the schedule for more information.

    Register and, if you read this blog, introduce yourself!


  • Opinions Change

    Dewey did *NOT* defeat Truman!

    Opinions change, and they should change as more and better evidence surfaces. There are words for this: “learning” and “growing” are but two of them.

    Communication plays a key role in changing opinions. I’ve been studying communication recently and I’ve decided it’s a miracle anyone is ever able to communicate with anyone else. There are several reasons; one reason is we’re wrong more than we are right.

    On Being Wrong

    In her TED talk Kathryn Schulz, "Wrongologist," states “We misunderstand signs around us.” "I can't actually think about anything I'm wrong about. At least not in the present tense." "So effectively we all wind up… traveling through life… trapped in this little bubble of feeling very right about everything." "How does it feel - emotionally - how does it feel to be wrong? How does it feel to realize you're wrong?"

    Her main point is: "Just being wrong doesn't feel like anything."

    Kathryn goes on to explain: "The way to succeed in life is to never make any mistakes." "...getting something wrong means there's something wrong with us."

    In response:

    1. We assume people are ignorant.
    2. We assume people are idiots.
    3. We assume people are evil.

    "This attachment to our own rightness keeps us from preventing mistakes when we absolutely need to, and causes us to treat each other terribly." "The miracle of your mind isn't that you can see the world as it is. It's that you can see the world as it isn't."

    Her (Brilliant, in my opinion) Conclusion:

    "It does feel like something to be wrong. It feels like being right."

    I encourage you to take a few minutes to listen to the TED talk. It’s awesome, as is Kathryn’s book:



    Learn more:


  • While You Have It…

    I remind my children who live at home - the "junior engineers" - to deal with the things they have in their hands while they have them in their hands. Why? Because it’s inefficient to put something down only to pick it up later and put it where it belongs. Like all rules, this one has exceptions.

    I also tell them to spend an extra 30 seconds each evening putting their shoes in the same place. When they forget to do this, they spend 10 minutes the next morning searching for their shoes. A little planning goes a long way.

    Deal with stuff while you have your hands on it, while it’s in front of you. And get it done.


    Learn more:


  • Presenting SSIS 2014 Data Flow Tuning Tips and Tricks at Triad PASS BI 25 Aug 2015!

    I am honored to present SSIS 2014 Data Flow Tuning Tips and Tricks to the Triad PASS BI chapter in Winston-Salem 25 Aug 2015.

    Do you want SSIS to go fast? This session is for you! Attend and learn techniques for developing, instrumenting, monitoring, and managing SSIS 2014 Data Flow performance in your data integration enterprise.
    Session Level: Beginner-Intermediate

    Register today!


  • SSIS Catalog Logging and Reporting in the Enterprise

    The SSIS Catalog, first introduced with the release of SQL Server 2012 Integration Services (SSIS 2012), is a big step forward in Data Integration Lifecycle Management (DILM) with SQL Server Integration Services (SSIS). Like all software, the SSIS Catalog performs some tasks admirably and other tasks… not so admirably. In this post, I will share some thoughts on SSIS Catalog logging.

    The Good


    My favorite SSIS Catalog feature is the integrated logging. It Just Works. I configure a default Logging Level when I create an SSIS Catalog (or later) and I don’t have to think about logging unless or until I want to change the Logging Level for some reason. And I can make that change on a per-package-execution basis, even. SSIS Catalog logging is fast and requires little or no work on my part to maintain. Other logging defaults configured at the time of the Catalog creation (or later) include the Retention Period and Clean Logs Periodically properties:


    I can run my data integration enterprise with these defaults. And I sometimes do.
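
    These properties are also visible from T-SQL. A minimal sketch that reads the current values from the catalog.catalog_properties view:

    Select property_name, property_value
    From SSISDB.catalog.catalog_properties
    Where property_name In ('SERVER_LOGGING_LEVEL',
                            'RETENTION_WINDOW',
                            'OPERATION_CLEANUP_ENABLED')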

    The OK

    Time-Deterministic Logging

    The SSIS Catalog will log some events in an order that is different from the order in which the events occurred in real time. In my experience, it’s possible to see events that occur within a few milliseconds logged out of order. I’ve explained this to a few auditors and, to date, none have disqualified a system because of this behavior.
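
    You can observe this for yourself by comparing the logged times with the order in which messages were written for a single execution. A sketch against the catalog.event_messages view (the operation_id value is a placeholder for the execution_id you want to inspect):

    Select event_message_id, message_time, event_name, package_name, message
    From SSISDB.catalog.event_messages
    Where operation_id = 12345   -- placeholder execution_id
    Order By message_time, event_message_id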

    The Bad

    Log Maintenance

    The way the SSIS Catalog (in 2012 and 2014, at least) cleans these log records can be troublesome. I won’t bore you with the technical details. I will just tell you that if you have lots of execution and event metadata stored in the SSIS Catalog log tables, the log cleaning process can take a long time and consume lots of server resources.


    This is potentially Ugly, but it is at least Bad. In order for someone to view the SSIS Catalog reports that ship with the solution (in SSIS 2012 and 2014), the following must be true:

    • The user must have SQL Server Management Studio (SSMS) installed.
    • The user must be a member of the ssis_admin or sysadmin roles.

    The reports are excellent:


    I have no issue with Database Administrators (DBAs), Database Developers, or Data Integration Developers using SSMS and the SSIS Catalog reports. I worry some about Operations people and project stakeholders needing SSMS and elevated permissions just so they can access operational or historical enterprise data integration information. Am I saying I do not want Operations people and project stakeholders to see the data contained in the SSIS Catalog reports? No, I most certainly am not. My concern is that someone will accidentally delete a mission-critical SSIS package from a Production SSIS Catalog. (Note: DBAs, Database Developers, and Data Integration Developers are just as at risk of accidentally deleting something as Operations people or project stakeholders.)

    You see, if you are a member of either the ssis_admin or sysadmin role, you have permission to delete SSIS packages. You also have permission to do a whole lot of other things that, frankly, you shouldn’t need to do.
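
    For that reason, it is worth knowing exactly who holds those permissions. A quick sketch that lists the members of the ssis_admin database role in SSISDB (sysadmin is a server-level role and is not listed here):

    Use SSISDB
    Go
    Select r.name As role_name,
           m.name As member_name
    From sys.database_role_members As drm
    Join sys.database_principals As r On r.principal_id = drm.role_principal_id
    Join sys.database_principals As m On m.principal_id = drm.member_principal_id
    Where r.name = 'ssis_admin'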

    The Ugly

    The defaults for SSIS Catalog logging properties are shown in the screenshot above. The default logging level is Basic. I like that setting. I also like cleaning the logs periodically.

    I cringe a little at the default retention period. Why? Experience. I’ve been an enterprise data integration architect for years. Doing that job will teach you to think about stuff normal people don’t consider. Here’s why I cringe at a 365-day default retention period: What happens 1 Jan? Most enterprises run mission-critical, regulation-required, end-of-year reports on or around 1 Jan. These processes may only execute once per year (but they’re tested each time a change is deployed, right? Don’t get me started…). They can be big and complex processes and they usually execute across a larger domain of data than nightly, weekly, monthly, or quarterly processing.

    What does that mean for the SSIS Catalog logs? More messages!

    Wait, it gets worse.

    Where are your support people on or around 1 Jan? Sitting at their desks watching these processes run to completion? Gosh, I hope not. Let’s say you manage to make it through that first SSIS-Catalog-managed year-end processing cycle without calling in too many people. Good. Fast-forward one year – or 365 days, as the Logging Retention Period property of the SSIS Catalog likes to call it. The default logging retention period also kicks in on or around 1 Jan (you think you have a 25% chance of hitting a leap year, but Murphy is way better at statistics than you or me…), right about the time your extra long data integration processing is trying to run. That smell coming from your server? That’s smoke.



    I can hear you thinking, “Great. Now what, Andy?” I’m glad you asked! Ask yourself, “Self, do we really need a year of historical enterprise data integration logging?”

    If the answer is “Yes, we need at least a year of log data,”  then I recommend you increase the Retention Period (days) property to some number greater than 365 – maybe 380, provided fifteen days will get your enterprise beyond the year-end process. The goal is to have the log maintenance from last year’s year-end processing occur sometime after this year’s year-end processing.

    If the answer is “No, we don’t need a year of log data,” I recommend you set the Retention Period (days) property to the number of days you need to retain.
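
    Either way, the Retention Period can be changed from T-SQL as well as from the Catalog Properties dialog in SSMS, using the catalog.configure_catalog stored procedure (380 is the example value discussed above):

    Exec SSISDB.catalog.configure_catalog
         @property_name = N'RETENTION_WINDOW',
         @property_value = 380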


    If you’re reading this and you have an SSIS Catalog in your enterprise, then you also have SQL Server Reporting Services (SSRS). You can write your own reports to execute against the logs and data integration metadata stored in the SSISDB database. SSRS reports can be deployed to run in a web browser. That makes them read-only from a user perspective.

    To be fair, the SSIS Catalog reports built into SSMS are also read-only. But SSMS is anything but read-only. Good DBAs will read this and (rightfully) think, “Well, enterprise security should define who has permissions to read and write and make changes to data in SSMS.” I concur, and that’s almost correct. The SSIS Catalog node in the SSMS Object Explorer is different: permissions checks are hard-coded into some of the SSIS Catalog Views. In order to view the reports, users must be a member of either ssis_admin or sysadmin.

    To also be fair, the user still must be a member of either ssis_admin or sysadmin to view data via SSIS Catalog Views from an SSRS solution deployed so that it runs in a web browser. Two thoughts:

    1. It’s a web browser and the reports are (hopefully) only executing SELECT statements.
    2. If you design your own reporting solution in SSRS, you can define how you access the data. You do not need to use the SSIS Catalog-supplied views with their hard-coded role checks; you can write your own views or stored procedures (see the sketch below).
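
    Here is a minimal sketch of such a view. It reads the internal tables directly to sidestep the role check in catalog.executions; the table and column names are assumptions based on how the catalog views appear to be built, so verify them against your SSISDB instance, and grant SELECT on the view only to the reporting account:

    Use SSISDB
    Go
    Create View dbo.rpt_Executions
    As
    -- Approximates catalog.executions without its hard-coded role check.
    Select e.execution_id,
           e.folder_name,
           e.project_name,
           e.package_name,
           o.status,
           o.start_time,
           o.end_time
    From internal.executions As e
    Join internal.operations As o On o.operation_id = e.execution_id
    Go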

    I and others have built custom SSRS reporting solutions that read the same SSIS Catalog execution logs and metadata. At Linchpin People, we are here to help™.


    The SSIS Catalog contains a great set of Data Integration Lifecycle Management tools, reports, and functionality. The logging and reporting is very good, but there are things of which you should be aware.


    I am here to help:
    Contact Andy!

    Learn more:
    SSIS Design Patterns training in London, UK, 7-10 Sep 2015    
    Advanced SSIS Training with Andy Leonard & Tim Mitchell in Reston Virginia Dec 7 - 10, 2015

    DevOps and Data Integration with SSIS (at IT/Dev Connections in Las Vegas, NV 15 Sep 2015)
    Designing an SSIS Framework (at IT/Dev Connections in Las Vegas, NV 16 Sep 2015)
    Using Biml to Automate SSIS Design Patterns (at the PASS Summit 2015 in Seattle, WA 27-30 Oct 2015)
    Hacking the SSIS Catalog (at the PASS Summit 2015 in Seattle, WA 27-30 Oct 2015)

    Stairway to Integration Services
    Linchpin People Blog: SSIS
    Stairway to Biml

    SSIS Design Patterns

  • SSIS Design Pattern: Access Parent Variables from a Child Package in the SSIS Catalog

    I am regularly asked about passing variable values between SSIS packages.

    Tim Mitchell (Blog | @Tim_Mitchell) wrote an excellent Notes from the Field article for the SQL Authority blog called SSIS Parameters in Parent-Child ETL Architectures. Before the first version of the SSIS Catalog was even released, Jamie Thomson blogged about The new Execute Package Task in SSIS in Denali and Eric Johnson blogged about Calling Child Packages in SSIS here at SQLBlog. In pre-SSIS Catalog days, I blogged about this topic at SQLBlog: Accessing Parent Package Variables In Child Packages... Without Configurations. There are many other posts related to accessing variable values in SSIS.

    This topic has received a fair amount of treatment.

    “Why are you writing about this again, Andy?”

    I’m glad you asked! I’ve learned something new about the behavior I described in my 2007 post. I’ve discovered a difference in the way SSIS parent-child package variable values interact in the SSIS Catalog and in SSDT-BI – the designer.

    Disclaimer: To be fair, the behavior I described in 2007 was not a design feature of SSIS. It worked, but it was (and is) a by-product of the way the original Execute Package Task was designed. I was given fair warning by the SSIS team at Microsoft that there were no guarantees this functionality would exist in future versions of the product.

    Another disclaimer: I am not picking on Microsoft! My intent in writing this is to inform those who read my 2007 post that this functionality still exists but is reported differently than some might expect, and to give you some sense of how things are different. Cool? K. Let’s move forward.

    Show and Tell

    Let’s build a demo SSIS project. I called mine ParentChildTests and added two SSIS packages named EPTParent.dtsx and Pkg1.dtsx:


    In the EPTParent.dtsx SSIS package, I add a package-scoped string variable named ParentVariable (creative name, I know!) with some default value:


    In the Pkg1.dtsx SSIS package, I add a Script Task configured to read the ReadOnlyVariables User::ParentVariable and System::PackageName. The Script Task contains the following code:

    // Log the name of the executing package.
    string packageName = Dts.Variables["System::PackageName"].Value.ToString();
    string msg = "I am " + packageName;
    bool fireAgain = true;

    Dts.Events.FireInformation(1001, "Script", msg, "", 0, ref fireAgain);

    // Log the value of the parent package's variable.
    // Note: ParentVariable is *not* defined in Pkg1.dtsx; it is resolved from the parent package.
    string parentVariable = Dts.Variables["User::ParentVariable"].Value.ToString();
    msg = "Parent Variable: " + parentVariable;

    Dts.Events.FireInformation(1001, "Script", msg, "", 0, ref fireAgain);

    The purpose of the Script Task is to generate a couple log messages by raising Information events. Please note: there is no variable in Pkg1.dtsx named ParentVariable! If you create a variable named ParentVariable in Pkg1.dtsx, this demo will not work. (You can validate the script before closing the VSTA window. I show you how here!)

    You can test-execute Pkg1.dtsx to see if it fails (it should fail):


    “Wait, what, Andy? You’re teaching us how to write SSIS packages that fail?” Yes. Yes, I am. On purpose, even. Why? Because you are going to write SSIS packages that fail. Teaching you anything else would be disingenuous. If failure bothers you, you may want to consider a different line of work.

    The key here is the package should fail and return the error:

    Failed to lock variable “User::ParentVariable” for read access…

    If you see this error, you’ve done everything I described correctly (so far).

    This error means there is a caveat: You cannot successfully execute this package stand-alone; to succeed, this SSIS package must be executed via the parent SSIS package (EPTParent.dtsx in this case).

    In the EPTParent.dtsx SSIS package, I add an Execute Package Task configured to execute Pkg1.dtsx:


    Before we test-execute EPTParent.dtsx in SQL Server Data Tools – Business Intelligence (SSDT-BI), I want to explain what I expect to happen. I expect EPTParent.dtsx to execute. I expect the Execute Package Task in EPTParent.dtsx to execute and, in turn, execute the Pkg1.dtsx SSIS package. I expect Pkg1.dtsx to execute – successfully this time – leveraging an undocumented (and unsupported) feature in the SSIS Execute Package Task that allows child packages to access parent package variables.

    Now you may be thinking, “Big deal, Andy. I can just use the Parent Package Variable Package Configuration to read the value of a parent package variable in a child package.” And you would be correct in your thinking. But I was very careful about the words I wrote just now, and I wrote “access.” Why didn’t I just write “read” since that’s all we’re doing here? Because it’s possible, using this very functionality, to also write a value from a child package to a parent package variable. That’s not possible using a Parent Package Variable Package Configuration.

    Taking a look at the Progress / Execution Results tab of Pkg1.dtsx, we see exactly what I thought I would see – a successful execution and an information message reporting the value of ParentVariable:


    “Why Are You Telling Me This?”

    Again, I’m glad you asked. So far, all I’ve showed you is what I wrote about in 2007.

    Here’s what’s changed. If I deploy this project to the SSIS Catalog, execute it, and view the Catalog reports, I will not see two package executions. Instead I will see a single package execution for EPTParent.dtsx:


    The Execution Overview table on the Overview report provides some clarity – especially the Task Name column:


    Clicking the View Messages link shows even more detail. The last few messages (sorted last-occurred, first-listed) demonstrate that Pkg1’s Script Task executed “inside” Pkg1, and provide some evidence that Pkg1 executed “inside” the Execute Package Task in the EPTParent.dtsx SSIS package:



    Is this merely trivial or anecdotal? As with so many other questions in the data field, the answer is “it depends.”

    If you need (or merely desire) visibility into your enterprise data integration lifecycle, my answer would be “no, this is not trivial or anecdotal.” Child package executions – when the child is executed via the Execute Package Task – are hidden from the “All Executions” report view. An SSIS package executed and it doesn’t show up. If you understand why it doesn’t show up, this is no big deal. But the report is not titled “Most Executions,” is it?
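
    The child package’s activity is still in the log, recorded under the parent’s execution. A sketch that filters the parent execution’s messages down to the child package (the operation_id value is a placeholder for the EPTParent.dtsx execution_id):

    Select message_time, package_name, event_name, message
    From SSISDB.catalog.event_messages
    Where operation_id = 12345        -- placeholder execution_id
      And package_name = 'Pkg1.dtsx'
    Order By message_time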

    “Why does this happen?”

    I go into more detail about this behavior in my article Advanced Event Behavior – Level 10 of the Stairway to Integration Services – part of the Stairway to SSIS series at SQL Server Central. That article includes this image (Figure 43), my attempt at constructing a visual representation that SSIS will never produce in real life. A Parent package – named Parent.dtsx – executes. It contains an Execute Package Task that executes a package named Precedence.dtsx. To construct this image, I used two screenshots of the Package Explorer tab – one from Parent.dtsx and one from Precedence.dtsx:


    This is one representation of what happens when a child SSIS package is executed from a parent SSIS package. It shows the child SSIS package running “in scope” of the Execute Package Task, and I believe that is an accurate representation. And I think that’s why we do not see the child package executions listed on the All Executions SSIS Catalog report.


    Learn more:
    SSIS Design Patterns training in London, UK, 7-10 Sep 2015    
    Advanced SSIS Training with Andy Leonard & Tim Mitchell in Reston Virginia Dec 7 - 10, 2015

    Stairway to Integration Services
    Linchpin People Blog: SSIS
    Stairway to Biml

    SSIS Design Patterns

  • What Customers Want

    Not once, in my combined years of direct employment by enterprises or consulting, has a customer contacted me about an issue with the grain of the date dimension in their enterprise data warehouse. Not one single time. I haven’t been contacted about cardinality either. Or trouble with execution threading models. Nope. Not ever.

    Instead, I’ve been contacted about other stuff. Stuff like the accuracy and “drill-ability” of a report and how long it takes to load data and how up-to-date the data is. Those are the concerns customers tell me they want addressed.

    Often the solution to the problem they are trying to solve involves data warehouse grain or cardinality or threading.

    Are my customers lying to me? Absolutely not! Part of my job is to comprehend what my customers perceive, to find out what they believe is the problem they’re trying to solve. “This report takes too long to display” is a valid description of a problem that needs solving. You don’t need to know what a tuple is to help. You don’t need to understand how index b-trees are traversed to get what they’re telling you. You don’t even need a degree in computer science to understand what they’re saying: the report is slow.

    All you need to do is listen.

    What customers want is your help.


    Learn more:

