THE SQL Server Blog Spot on the Web


Andy Leonard

Andy Leonard is a Data Philosopher at Enterprise Data & Analytics; an SSIS trainer, consultant, and developer; a Business Intelligence Markup Language (Biml) developer and BimlHero; a SQL Server database and data warehouse developer; and a community mentor, engineer, and farmer. He is a co-author of SQL Server Integration Services Design Patterns, and the author of Managing Geeks - A Journey of Leading by Doing and the Stairway to Integration Services series.


    During a delivery of Advanced SSIS Training (by Tim Mitchell and Andy Leonard), Michael Smith suggested an idea for a site where SSIS performance benchmarks could be created and recorded. Thus, the site was born. I was so struck by the idea that I purchased the domain while Tim was delivering a couple of modules on Advanced SSIS!

    The site is designed by an engineer for engineers. The idea? Create repeatable tests that can be executed by data integration developers who use SSIS (SQL Server Integration Services) along with a site to store and present results for comparison.

    I created a test SSIS project that loads National Provider Identifier (NPI) data from the National Plan and Provider Enumeration System (NPPES). NPPES data is collected and maintained by the US Centers for Medicare and Medicaid Services (CMS) and lists US healthcare providers. How can you use this?

    You can find the test procedure here. It includes links to the source data and the SSIS project named NPIData. The test project contains an SSIS package that creates a database and the two tables required for testing purposes. Instructions are included in the test procedure. Please let me know if you have any questions.

    You can view my test conditions and results here.

    I welcome your feedback.


  • ETL Instrumentation in 2016

    A long time ago in a galaxy far, far away… I wrote about ETL Instrumentation.

    Ok, it wasn’t that long ago, and it wasn’t far away. But things have changed some since I wrote that series, and I would like to spend some time updating the material. I’m typing this post at the end of 2015, so I’m using SQL Server 2016 CTP 3.2 running on Windows Server 2016 Technical Preview 4. The SSDT-BI IDE I’m using is the December preview running in Visual Studio 2015. I downloaded the AdventureWorks 2016 CTP 3 Sample database scripts from Codeplex. Finally (or first…), I build VMs for testing purposes and I believe you should too. You can spin up a VM in Azure pretty quickly these days. To do so, start at the Azure Portal. For local instances, I use Oracle VirtualBox to build VMs. You can also use VMWare or Hyper-V for local virtualization. Pick a hypervisor, local or in the cloud, and build a VM.

    As in the original article, let's begin by creating a database to host our instrumented data:

    use master
    go

    print 'SSISRunTimeMetrics database'
    if not exists(select name
                  from master.sys.databases
                  where name = 'SSISRunTimeMetrics')
     begin
      print ' - Creating SSISRunTimeMetrics database'
      Create Database SSISRunTimeMetrics
      print ' - SSISRunTimeMetrics database created'
     end
    else
     print ' - SSISRunTimeMetrics database already exists.'

    Next, let’s build a schema and a table to hold our metrics:

    -- create ssis schema...
    print ''
    print 'ssis Schema'
    if not exists(select name
                  from sys.schemas
                  where name = 'ssis')
     begin
      print ' - Creating ssis Schema'
      -- Create Schema must be the only statement in its batch, so execute it dynamically...
      declare @sql varchar(100)
      set @sql = 'Create Schema ssis'
      exec(@sql)
      print ' - ssis Schema created'
     end
    else
     print ' - ssis Schema already exists.'

    -- create RunTimeMetrics table...
    print ''
    print 'ssis.RunTimeMetrics Table'
    if not exists(select s.name + '.' + t.name
                  from sys.tables t
                  inner join sys.schemas s on s.schema_id = t.schema_id
                  where t.name = 'RunTimeMetrics'
                   and s.name = 'ssis')
     begin
      print ' - Creating ssis.RunTimeMetrics Table'
      Create Table ssis.RunTimeMetrics
      (id int identity(1,1)
      ,packageStartDateTime datetime null
      ,packageEndDateTime datetime null
      ,packageName varchar(255) null
      ,packageVersion varchar(25) null
      ,packageStatus varchar(25) null)
      print ' - ssis.RunTimeMetrics Table created'
     end
    else
     print ' - ssis.RunTimeMetrics Table already exists.'

    We’ll depart from the original article here – but just a little – and create a stored procedure named usp_RecordManualPackageStart in the ssis schema:

    use SSISRunTimeMetrics
    go

    print ''
    print 'ssis.usp_RecordManualPackageStart Stored Procedure'
    if exists(select s.name + '.' + p.name
              from sys.procedures p
              inner join sys.schemas s on s.schema_id = p.schema_id
              where p.name = 'usp_RecordManualPackageStart'
               and s.name = 'ssis')
     begin
      print ' - Dropping ssis.usp_RecordManualPackageStart Stored Procedure'
      Drop Procedure ssis.usp_RecordManualPackageStart
      print ' - ssis.usp_RecordManualPackageStart Stored Procedure dropped'
     end
    print ' - Creating ssis.usp_RecordManualPackageStart Stored Procedure'
    go

    Create Procedure ssis.usp_RecordManualPackageStart
      @packageStartDateTime datetime = null
     ,@packageName varchar(255) = null
     ,@packageVersion varchar(25) = null
    As
     begin
      -- set @packageStartDateTime default...
      declare @Now datetime
      set @Now = GetDate()

      if (@packageStartDateTime Is Null)
       set @packageStartDateTime = @Now

      -- insert the run time metrics data...
      insert into ssis.RunTimeMetrics
       (packageStartDateTime, packageName, packageVersion, packageStatus)
      values (@packageStartDateTime, @packageName, @packageVersion, 'Started')
     end
    go

    print ' - ssis.usp_RecordManualPackageStart Stored Procedure created'

    We make this change to support SSIS packages executed from the file system or the MSDB database, both of which remain supported in SSIS 2016 (and for which I nod with approval to the Microsoft SSIS Developer Team. Options. Always give me options, please!).

    Building the SSIS Test Project

    Open SQL Server Data Tools – Business Intelligence (SSDT-BI) and create a new project. When the New Project dialog displays, select Business Intelligence Projects from the Project types list and Integration Services Project from the Templates list. Name the project SSISRunTimeMetrics and click the OK button:


    One Glitch…

    When I click the OK button, I get an error that looks like this:


    To work around this error (which I suspect is a beta issue), I click OK and then right-click the solution in Solution Explorer. I then hover over "Add" and click "Existing Project…":


    I navigate to the solution folder and locate and select the project file:


    When I click the Open button, the project appears as part of the solution:


    Adding a new package is easy:


    After it’s added, I rename the new package to SSISRunTimeMetrics.dtsx:



    I still like to add annotations to my SSIS packages (although I would like to be able to change the color of the font…):


    I can't say enough about good documentation. You never know when someone else will have to open your source code to troubleshoot.

    PackageVersion: One Way

    Right-click in the white-space of the Control Flow canvas and click Variables. Add a variable named PackageVersion. Make sure the Scope is the SSISRunTimeMetrics package and make the Data Type String:


    Click the ellipsis for the Expression and add the following SSIS Expression Language to automate the variable value:

    (DT_WSTR,4) @[System::VersionMajor] + "." +
    (DT_WSTR,4) @[System::VersionMinor] + "." +
    (DT_WSTR,4) @[System::VersionBuild] + "." +
    (DT_WSTR,4) Year(@[System::CreationDate]) +
    ((Month(@[System::CreationDate]) < 10) ? "0" : "") +
    (DT_WSTR,2) Month(@[System::CreationDate]) +
    ((Day(@[System::CreationDate]) < 10) ? "0" : "") +
    (DT_WSTR,2) Day(@[System::CreationDate])
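    To sanity-check what the expression produces, here is a small sketch (Python, purely illustrative; not part of the SSIS package) of the same concatenation and zero-padding logic:

```python
from datetime import date

def package_version(major, minor, build, created):
    # Mimic the SSIS expression: Major.Minor.Build.YYYYMMDD,
    # zero-padding month and day when they are below 10.
    month = ("0" if created.month < 10 else "") + str(created.month)
    day = ("0" if created.day < 10 else "") + str(created.day)
    return "{}.{}.{}.{}{}{}".format(major, minor, build, created.year, month, day)

print(package_version(1, 0, 23, date(2015, 12, 31)))  # 1.0.23.20151231
```

    A package created 5 Jan 2016 with version 2.1, build 7, would yield "2.1.7.20160105" - the conditional padding keeps the date portion a fixed eight characters.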


    Once entered, the variable window will display as shown here:


    I manually update the System::VersionMajor and System::VersionMinor variable values. System::VersionBuild is automatically incremented each time you save the package. I rarely parse the System::CreationDate as shown in this expression. I wish SSIS packages had a System::ModifiedDate property. If they did, I would parse that date as shown here. But they don’t, so I usually store the modified date in the System::VersionComments property manually, which alters my expression thus:

    (DT_WSTR,4) @[System::VersionMajor] + "." +
    (DT_WSTR,4) @[System::VersionMinor] + "." +
    (DT_WSTR,4) @[System::VersionBuild] + "." +
    @[System::VersionComments]
    I just have to remember to manually update the value of System::VersionComments in the package properties:


    I labeled this section “PackageVersion: One Way” because there are other ways to track the version of an SSIS package in SSIS 2016. The VersionGUID package property shown above is another way, and this property is read-only from the SSDT-BI IDE (although editable from the dtsx file). How you manage SSIS package versions is not nearly as important as tracking them somehow.

    Add a Container

    Add a Sequence Container and change the name to "SEQ Step 1 – Do Some Stuff":


    Sequence Containers are nice for several reasons:

    • They're aesthetic. You can break up chunks of functionality logically... how cool!
    • When troubleshooting, you can execute the contents of a container from the right-click context menu.
    • You can isolate chunks of functionality using transactions by container.

    Add an Execute SQL Task

    Next, add an Execute SQL Task and name it "SQL Log Manual Start of Package Execution":


    Double-click the Execute SQL Task to edit it. Change the ConnectionType property to ADO.Net. Click Connection and select "<New connection...>":


    When the Configure ADO.NET Connection Manager dialog displays, click the "New..." button:


    I really like the “New Connection” functionality in SSIS. It picks the correct Provider for the option selected in the Execute SQL Task. Configure the connection to match where you built the SSISRunTimeMetrics database and then click the OK button:


    When you return to the Configure ADO.NET Connection Manager window, it should now display your connection in the Data Connections list:


    Click OK to return to the Execute SQL Task Editor. Click the ellipsis in the SQLStatement property to open the Enter SQL Query dialog. Enter the name of the ssis.usp_RecordManualPackageStart stored procedure:


    Click the OK button to return to the Execute SQL Task Editor. Set the IsQueryStoredProcedure property (available for ADO.Net connection types) from False to True:


    What’s nice about using ADO.Net and the IsQueryStoredProcedure property is parameter configuration, which comes next. Click on the Parameter Mapping page:


    Click the Add button and add the following three parameter mappings:

    Variable Name: System::StartTime
    Direction: Input
    Data Type: DateTime
    Parameter Name: packageStartDateTime
    Parameter Size: -1

    Variable Name: System::PackageName
    Direction: Input
    Data Type: String
    Parameter Name: packageName
    Parameter Size: -1

    Variable Name: User::PackageVersion
    Direction: Input
    Data Type: String
    Parameter Name: packageVersion
    Parameter Size: -1
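    Conceptually, each mapping hands an SSIS variable's value to the stored procedure parameter of the given name. A language-agnostic sketch (Python, illustrative only; the variable values are made up):

```python
# SSIS variables available at run time (values invented for illustration).
ssis_variables = {
    "System::StartTime": "2015-12-30 08:00:00",
    "System::PackageName": "SSISRunTimeMetrics",
    "User::PackageVersion": "1.0.23.20151230",
}

# The three parameter mappings configured in the Execute SQL Task.
parameter_mappings = {
    "System::StartTime": "packageStartDateTime",
    "System::PackageName": "packageName",
    "User::PackageVersion": "packageVersion",
}

def build_parameters(variables, mappings):
    # Resolve each mapping to the (parameterName: value) pairs the
    # ADO.NET provider would pass to the stored procedure by name.
    return {param: variables[var] for var, param in mappings.items()}

params = build_parameters(ssis_variables, parameter_mappings)
print(params["packageName"])  # SSISRunTimeMetrics
```

    Because ADO.Net matches parameters by name rather than by ordinal, the order of the mappings in the editor does not matter.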


    Click the OK button to proceed.

    Add a Script Task

    Drag a Script Task into the sequence container and rename it “SCR Output Package Version”:


    Open the Script Task Editor and click the ellipsis in the ReadOnlyVariables property. Add the User::PackageVersion variable:


    Click the Edit Script button to open the VSTA Editor, and enter the following code in the public void Main() method:

    public void Main()
    {
        string packageVersion = Dts.Variables["User::PackageVersion"].Value.ToString();
        string msg = "Version: " + packageVersion;
        bool fireAgain = true;

        Dts.Events.FireInformation(1001, "SSISRunTimeMetrics", msg, "", 0, ref fireAgain);

        Dts.TaskResult = (int)ScriptResults.Success;
    }

    Close the VSTA Editor and click the OK button to close the Script Task Editor. Your SSIS package should appear as shown here:


    Let’s Test It!

    Execute the package in the SSDT-BI debugger. Your results should match those shown here (Success!):


    Click on the Progress tab to locate the Information message we generated in the Script Task:


    Voila! It works.

    Checking for Data

    Now let’s check the database to see if our message was logged. Open SSMS and connect to your SSISRunTimeMetrics database. Execute a query to check the contents of the SSISRunTimeMetrics table:

    select * from ssis.RunTimeMetrics


    This is a good sign. Our data is being logged and shows the package started execution.

    Logging Execution Completion

    It's time to add a procedure to log the end of package execution, but we have a problem: How do we identify the row we just added to the ssis.RunTimeMetrics table? Think about it. We just inserted a row, and since that's the only row in the table it's pretty simple to identify. Later, we will have lots of rows in the table in varying states. We need a method to positively identify the row we're working with.

    We could use the last row inserted - apply Max(id) criteria - but what happens as our metrics table grows to include multiple packages? One package could start while another was executing and we'd update the wrong row.
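    To see why Max(id) is fragile, consider this small simulation (Python, illustrative only; the package names are made up). Two packages start, and when the first one finishes, updating by Max(id) hits the wrong row:

```python
# Simulate ssis.RunTimeMetrics as an in-memory list of rows.
metrics = []

def record_start(package_name):
    # Insert a "Started" row; the returned id plays the role of the
    # identity value for the newly inserted row.
    row = {"id": len(metrics) + 1,
           "packageName": package_name,
           "packageStatus": "Started"}
    metrics.append(row)
    return row["id"]

def record_end_by_max_id(status="Succeeded"):
    # The fragile approach: always update the row with the highest id.
    max(metrics, key=lambda r: r["id"])["packageStatus"] = status

id_a = record_start("PackageA")  # id 1
id_b = record_start("PackageB")  # id 2 starts while PackageA is still running

record_end_by_max_id()  # PackageA finishes first, but Max(id) finds PackageB's row

print(metrics[0]["packageStatus"])  # still "Started" - PackageA's row was never updated
print(metrics[1]["packageStatus"])  # "Succeeded" - but PackageB hasn't finished!
```

    Capturing the inserted row's id at start time, as the OUTPUT clause allows, avoids this race entirely.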

    We require a means to positively identify the row when added to the table. Such a means exists. If we modify the ssis.usp_RecordManualPackageStart stored procedure we can have it return the value of the inserted ID. The OUTPUT clause gives us access to the inserted and deleted virtual tables. Modify the stored procedure by adding the following OUTPUT clause as shown:

    output inserted.id as 'Inserted_ID'

    Execute the Transact-SQL to drop and recreate the ssis.usp_RecordManualPackageStart stored procedure:

    print 'ssis.usp_RecordManualPackageStart Stored Procedure'
    if exists(select s.name + '.' + p.name
              from sys.procedures p
              inner join sys.schemas s on s.schema_id = p.schema_id
              where p.name = 'usp_RecordManualPackageStart'
               and s.name = 'ssis')
     begin
      print ' - Dropping ssis.usp_RecordManualPackageStart Stored Procedure'
      Drop Procedure ssis.usp_RecordManualPackageStart
      print ' - ssis.usp_RecordManualPackageStart Stored Procedure dropped'
     end
    print ' - Creating ssis.usp_RecordManualPackageStart Stored Procedure'
    go

    Create Procedure ssis.usp_RecordManualPackageStart
      @packageStartDateTime datetime = null
     ,@packageName varchar(255) = null
     ,@packageVersion varchar(25) = null
    As
     begin
      -- set @packageStartDateTime default...
      declare @Now datetime
      set @Now = GetDate()

      if (@packageStartDateTime Is Null)
       set @packageStartDateTime = @Now

      -- insert the run time metrics data, returning the new id...
      insert into ssis.RunTimeMetrics
       (packageStartDateTime, packageName, packageVersion, packageStatus)
      output inserted.id as 'Inserted_ID'
      values (@packageStartDateTime, @packageName, @packageVersion, 'Started')
     end
    go

    print ' - ssis.usp_RecordManualPackageStart Stored Procedure created'

    Add another Int32 Data Type variable named PackageInstanceId:


    Open the editor for the “SQL Log Manual Start of Package Execution” Execute SQL Task. Change the ResultSet property to “Single row”:


    Click the Result Set page and add a new resultset. Name it “0” (ordinals work well) and assign the results of this query to the PackageInstanceId variable you just created. This will push the results of the OUTPUT clause in the stored procedure (which will include the id of the newly inserted row) into the PackageInstanceId variable. You can then access it later in the package to update the precise row:


    Next, edit the Script Task by adding PackageInstanceId to the list of ReadOnlyVariables:


    Then edit the script in public void Main() to read:

    public void Main()
    {
        string packageVersion = Dts.Variables["User::PackageVersion"].Value.ToString();
        int packageInstanceId = Convert.ToInt32(Dts.Variables["User::PackageInstanceId"].Value);
        string msg = "Version: " + packageVersion + " PackageInstanceId: " + packageInstanceId.ToString();
        bool fireAgain = true;

        Dts.Events.FireInformation(1001, "SSISRunTimeMetrics", msg, "", 0, ref fireAgain);

        Dts.TaskResult = (int)ScriptResults.Success;
    }

    Next, add a new stored procedure named ssis.usp_RecordManualPackageEnd to the SSISRunTimeMetrics database using the following Transact-SQL script:
    use SSISRunTimeMetrics
    go

    print ''
    print 'ssis.usp_RecordManualPackageEnd Stored Procedure'
    if exists(select s.name + '.' + p.name
              from sys.procedures p
              inner join sys.schemas s on s.schema_id = p.schema_id
              where p.name = 'usp_RecordManualPackageEnd'
               and s.name = 'ssis')
     begin
      print ' - Dropping ssis.usp_RecordManualPackageEnd Stored Procedure'
      Drop Procedure ssis.usp_RecordManualPackageEnd
      print ' - ssis.usp_RecordManualPackageEnd Stored Procedure dropped'
     end
    print ' - Creating ssis.usp_RecordManualPackageEnd Stored Procedure'
    go

    Create Procedure ssis.usp_RecordManualPackageEnd
      @id int
     ,@packageStatus varchar(25) = 'Succeeded'
    As
     begin
      -- set @packageEndDateTime default...
      declare @packageEndDateTime datetime = GetDate()

      -- update the run time metrics data...
      update ssis.RunTimeMetrics
       Set packageEndDateTime = @packageEndDateTime
          ,packageStatus = @packageStatus
      where id = @id
     end
    go

    print ' - ssis.usp_RecordManualPackageEnd Stored Procedure created'

    Now add a new Execute SQL Task named "SQL Log Manual End of Package Execution":


    Open the Execute SQL Task Editor and set the ConnectionType to ADO.Net and the Connection to your SSISRunTimeMetrics connection manager. Edit the SQLStatement property, changing it to ssis.usp_RecordManualPackageEnd. Set the IsQueryStoredProcedure property to True. Navigate to the Parameter Mapping page and add a mapping for PackageInstanceId as shown here:



    That’s all we need to conduct our next test, so let’s roll (as my friend Chris Yates [blog] says)!


    The package execution succeeds – that’s a good start. Let’s look at the Progress tab:


    Our information message tells us we created PackageInstanceId 2 – another good sign. Next, let’s look at the metadata in our database:


    w00t! Very cool, our metadata shows us when the package started and ended!


    “In an age of the SSIS Catalog, why would one ever employ this kind of metadata collection, Andy?” That’s a fair question. The SSIS Catalog is an awesome data integration execution, logging, and externalization engine. There are a handful of use cases, though, where enterprises may opt to continue to execute SSIS packages from the file system or the MSDB database. Perhaps the biggest reason to do so is that’s the way the enterprise is currently executing SSIS. When SSDT-BI converts pre-Catalog-era (2005, 2008, 2008 R2) SSIS packages to current, it imports these packages in a “Package Deployment Model” SSIS Project. This allows developers to upgrade the version of their SSIS project to SSIS 2016 (and enjoy many benefits for so doing) while continuing to execute SSIS packages in the file system. Kudos to the Microsoft SSIS Development Team for this backwards compatibility!

    If you do not have an SSIS Catalog collecting this kind of metadata for you, creating a database similar to SSISRunTimeMetrics is a valid solution. Yes, you have to manually add ETL instrumentation to your SSIS packages, but I believe that cost and overhead are outweighed by the cost of having no solution at all for collecting data integration execution metadata.
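    Once collected, even simple analysis of this metadata pays off - for example, computing execution durations per package. A sketch (Python, illustrative only; the rows, package names, and timestamps are invented, matching the column names of the ssis.RunTimeMetrics table above):

```python
from datetime import datetime

# Rows as they might come back from: select * from ssis.RunTimeMetrics
rows = [
    {"packageName": "SSISRunTimeMetrics.dtsx",
     "packageStartDateTime": datetime(2015, 12, 30, 8, 0, 0),
     "packageEndDateTime": datetime(2015, 12, 30, 8, 0, 42)},
    {"packageName": "SSISRunTimeMetrics.dtsx",
     "packageStartDateTime": datetime(2015, 12, 31, 8, 0, 0),
     "packageEndDateTime": datetime(2015, 12, 31, 8, 1, 6)},
    {"packageName": "SSISRunTimeMetrics.dtsx",
     "packageStartDateTime": datetime(2015, 12, 31, 9, 0, 0),
     "packageEndDateTime": None},  # still running - no duration yet
]

def durations_seconds(rows):
    # Return (packageName, seconds) for each completed execution,
    # skipping rows whose end time has not been recorded.
    return [(r["packageName"],
             (r["packageEndDateTime"] - r["packageStartDateTime"]).total_seconds())
            for r in rows if r["packageEndDateTime"] is not None]

for name, secs in durations_seconds(rows):
    print(name, secs)
```

    The same calculation is a one-line DATEDIFF in Transact-SQL; the point is that start/end pairs keyed by id give you per-execution run times for free.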

    As always, I welcome your thoughts on the matter!


  • SSIS Design Patterns and BIML: A Day of Intelligent Data Integration

    I’m honored to deliver a brand new precon (the PowerPoint slides still have that new slide smell) at SQL Saturday #477 – Atlanta – BI Edition! The precon is titled SSIS Design Patterns and Biml: A Day of Intelligent Data Integration.


    Join Andy Leonard, founder of Enterprise Data & Analytics, SSIS coach and trainer, author, and consultant, for a day of training focused on intelligent data integration. Andy is an experienced SSIS consultant, but he also led an enterprise team of 40 ETL developers during projects that spanned 2.5 years. And delivered.

    The target audience for this course is data integration developers and architects who want to learn more about SSIS performance, DevOps, execution, and automation.

    Attendees will learn:
    - a holistic approach to data integration design.
    - a methodology for enterprise data integration that spans development through operational support.
    - how automation changes everything. Including data integration with SSIS.

    Topics include:
    1. SSIS Design Patterns
       Data Flow Performance
       ETL Instrumentation
    2. Executing SSIS in the Enterprise
       The SSIS Catalog - the good, the bad, and the ugly.
    3. Custom SSIS Execution Frameworks
    4. DevOps and SSIS
       A (Condensed) Sprint in the Life of a Data Integration Solution
       Version Control and SSIS
    5. Business Intelligence Markup Language
       A Brief Introduction to Biml in BimlStudio
    6. SSIS Design Patterns + Biml
       Putting the pieces together.
    7. SSIS Design Patterns + Biml + Custom SSIS Execution Frameworks
       Executing the new combinations.
    8. SSIS Design Patterns + Biml + Custom SSIS Execution Frameworks => DevOps
       Enterprise-class data integration with SSIS.

    Register today!

    I’m also presenting Using BIML as an SSIS Design Patterns Engine at SQL Saturday #477 – Atlanta – BI Edition! Right after my presentation, Scott Currie (the guy who created Mist and BimlOnline, and who shepherds Biml) is presenting SSIS Unit and Integration Testing with the Open Source BEST project!

    As if that’s not enough, Patrick LeBlanc and Adam Saxton are also delivering a precon titled Microsoft BI in a Day. There are tons of really smart speakers lined up to present (plus me)! It’s going to be a great event!!

    You can register for SQL Saturday #477 – Atlanta – BI Edition here. I hope to see you there!


  • Presenting “Use Biml to Automate SSIS Design Patterns” at SQL Saturday-Nashville!

    I’m honored to present “Use Biml to Automate SSIS Design Patterns” at SQL Saturday – Nashville 16 Jan 2016!


    Perhaps you’ve heard the buzz about Business Intelligence Markup Language (Biml) and wondered, “Why is everyone so excited about Biml?” Occasionally, a new technology emerges that changes everything. For SSIS developers, Biml is one such technology. Business Intelligence Markup Language (Biml) is a powerful solution for rapidly creating SSIS packages and solutions. SSIS Design Patterns support various load profiles. Combined, Biml and SSIS Design Patterns offer a compelling solution for automating enterprise data integration.
    In this session, you will learn: 
    -	How to rapidly build multiple SSIS packages using Biml
    -	How to apply different SSIS Design Patterns using Biml
    -	How to automate SSIS Design Patterns using Biml
    Register today!
    I hope to see you there!
  • My New Gig: Enterprise Data & Analytics!

    Enterprise Data & Analytics

    I’m excited to announce my next venture: Enterprise Data & Analytics!

    My role is still Data Philosopher. My job is to help clients realize the potential of their enterprise data. “How will you do that, Andy?” I’m glad you asked.

    EnterpriseDNA delivers consulting, analysis, and development to support:

    • Business Intelligence and Analytics
    • Data Strategy
    • Coaching
    • Enterprise Architecture
    • Performance Tuning
    • Application Development
    • Database and Data Warehouse Development

    EnterpriseDNA is committed to the developer and database communities. We’re a Gold sponsor of the very first BI-focused SQL Saturday in Atlanta and I am presenting a precon before the event titled SSIS Design Patterns and Biml: A Day of Intelligent Data Integration on 8 Jan 2016. (Register here!)

    I’m looking forward to the new venture!


  • Announcing SSIS Catalog Reports v0.1

    I’ve released a (very early) version of SSIS Catalog Reports – v0.1 – on GitHub. You can access the project here.

    My main goal is to provide some visibility into the SSIS Catalog without the need for SQL Server Management Studio. “Why, Andy?” I’m glad you asked. In some enterprises, there may be:

    • Operations people monitoring data integration;
    • Stakeholders who are less-database-savvy; and / or
    • DevOps-ish release / configurations management teams.

    Some (not all) of these folks could benefit from access to SSIS Catalog Reports apart from access to the other stuff in SQL Server Management Studio (SSMS) – especially in Production. SSRS provides a browser-based and read-only view into the enterprise Production data integration environment.

    Feedback is appreciated.


  • Leaving Linchpin People

    I’m sad and excited to announce I am no longer with Linchpin People.

    I’m sad because I will miss working with and leading the awesome team at Linchpin People. I’m excited to return to independent consulting as Andy Leonard Consulting!

    Update: My new venture is Enterprise Data & Analytics! 


  • Presenting at BIG PASS 10 Dec 2015!

    I am honored to announce I will be presenting (remotely) “I See a Control Flow Tab. Now What?” at BIG PASS in Lake Forest, California 10 Dec 2015 at 6:30 PM PT!

    This demo-intense presentation is for beginners and developers just getting started with SSIS. Attend and learn how to build SSIS packages from the ground up.

    Register today!


  • SSIS Catalog Compare 3-Minute Drill

    The actual time is about 3:15…

    In a new video, I demonstrate how to use SSIS Catalog Compare to deploy an SSIS Project and all configuration metadata from one SSIS Catalog to another. I call it the 3-Minute Drill.



    Learn more:

    Catalog Compare 0.5 Browsing the Catalog
    Catalog Compare 0.5 Comparing Catalogs
    SSIS Catalog Compare

    Advanced SSIS Training with Andy Leonard & Tim Mitchell in Reston Virginia Dec 7 - 10, 2015
    IESSIS1: Immersion Event on Learning SQL Server Integration Services
    IESSIS2: Immersion Event on Advanced SQL Server Integration Services

    Stairway to Integration Services

    SSIS Design Patterns

    Andy Leonard Consulting – I am here to help.

  • Announcing Three Free Biml and SSIS Webinars

    Thanks to everyone who attended the Biml 101 webinar! The recording and demo files for Biml 101 are available here.

    I’ve scheduled three additional webinars in the series:

    Other folks from the Biml Community will be joining me for presentations beginning in 2016. Since there’s so much going on, I’ve started Andy's Biml mailing list to help keep you aware of Biml training from Andy Leonard Consulting. I hope you’ll join and I promise to keep the email to a minimum (and you can always unsubscribe if it gets to be too much). Consider joining today!


    Learn more:

    Related Training
    Advanced SSIS Training with Andy Leonard & Tim Mitchell in Reston Virginia Dec 7 - 10, 2015
    IESSIS1: Immersion Event on Learning SQL Server Integration Services
    IESSIS2: Immersion Event on Advanced SQL Server Integration Services

    Related Articles
    Stairway to Integration Services
    Stairway to Biml

    Related Resources
    Join Andy's Biml mailing list!

    Related Books
    SSIS Design Patterns

    Andy Leonard Consulting – I am here to help.

  • PASS Summit 2015 Feedback

    We are part of an awesome community. The SQL Server Community or SQL Family, as many call it, is astounding. Our annual big family reunion is the PASS Summit. I received feedback from attendee evaluations the other day and I would like to share some of it for one reason: many people spoke at the PASS Summit for the very first time in October 2015. Presenting at the largest SQL Server conference on the planet is daunting. Speaking there the first time is flat-out intimidating.

    My primary audience for this post is presenters, especially that handful of people who spoke for the first time at the PASS Summit 2015. But I think a broader audience of speakers with limited experience or people who desire to become speakers may also benefit from these words (that’s why I decided to blog about it!).

    “How’d You Do at the PASS Summit 2015, Andy?”

    I’m glad you asked. I had a great time presenting! Why is that my first response? Who cares about how I felt, for goodness sake? I was there to do a job, right? Well, yes and no. In exchange for presenting, I am allowed to attend the PASS Summit for free. Paying attendees are there to listen to me deliver my presentation, so I am working for them. As a Data Philosopher at Andy Leonard Consulting, part of my job is to find gainful employment in the form of training or consulting gigs. In that sense, presenting at the PASS Summit 2015 (or anywhere) is at least partially work-related.

    But work is not why I present.

    I present because I love learning. I get excited about learning new stuff, whether it’s technology or something else. I always have. I suspect I always will. Why? I don’t know. But there’s no denying I thoroughly enjoy learning new stuff.

    Presenting is one way to share this love with others. And sharing the love is why I present.

    What makes for a great time presenting?

    1. I know the material.
    2. I enjoy the material.
    3. I have a story to tell.
    4. I have “a good story-telling day.”

    That last one is tricky, so I will explain. I have ADHD. Sometimes ADHD is a blessing. When presenting, though, I need to be as focused as possible. ADHD does not promote focus.

    “What Feedback Did You Receive From Summit Attendees, Andy?”

    Ah, you’ve learned well. It’s always best to ask Data Philosophers with ADHD specific questions. I feel the ratings and feedback were perhaps the highest and best I have ever received from attendees of the PASS Summit.

    I’m a huge fan of the 3-point rating system employed by the PASS Program Committee for evaluations this year. There were only four questions on the evaluation. I admire those decisions greatly. Kudos to the PASS Summit Program Committee!

    The questions asked were:

    1. Overall Session - Did the title, abstract, and session level align to what was presented?
    2. Session Content – Was the content useful and relevant?
    3. Presentation - Was the speaker articulate, prepared, and knowledgeable on the subject matter?
    4. Promotions - Did the session contain blatant sales material or self-promotion such as blogs, company, or services?

    My “nummers” (a Hazzard-ism):

    Use Biml to Automate SSIS Design Patterns

    • Overall Session: 2.96
    • Session Content: 2.85
    • Presentation: 2.92
    • Promotions: 1.23

    A sample of comments:

    • Speaker was good.  Not a level 400 session.
    • Very good content but a little bit hard to follow at times.  The concepts did get through though.
    • I enjoy sessions with Andy.  That said, he treads a fine line with promoting BIML the product versus the usefulness of using BIML. 
    • Lots of info efficiently presented, straight to the point. Great brewing of ideas to automate package creation using patterns, possibilities through the roof when you follow standards and guidelines.
    • This is the best session I have seen Andy do; SQL Server conferences are awash with introductory material on BIML, but very few presenters are addressing the complexities that arise in non-trivial BIML projects.
    • Best session so far. Fantastic information and presentation.
    • Loved it. Very excited to use moving forward. He did say, "Tough", though.

    My thoughts:

    I’m not as good as the positive comments indicate. The first clue is the negative comments. I’m always fascinated by comments on the level of a session. Paul Randal and Adam Machanic touch on the topic of Level in their excellent posts (click the links and read if you’re a presenter or want to be a presenter). Peeves make lousy pets. That said, comments about level are a pet peeve. Why?


    Three points…

    Every single session I deliver is going to contain material that is below the session level. Every. single. one. I was an instructor before I started working with database technology. Instruction always involves ramping up to a topic – starting with the familiar and working into the unfamiliar. Unless you’re marketing. I don’t do marketing – at least not intentionally.

    I’m not sure how to fix the complaint in the comment. I’m not even sure what the complaint is. Is the issue that the session was really a 500-level? a 300-level? I’m going to assume the complaint is that the session didn’t rise to the advertised level of 400, which brings me to…

    Your “400” is different from the “400” of others. In addition (pun intended), my “400” on SSIS and Biml is different from my “400” on C#. The session level is not static. Questions from attendees can impact the overall level of the material delivered. Each time I’ve followed up on similar comments, I’ve learned the person is communicating, “I expected to learn more than you shared.” Now that’s a useful comment! I either need to deliver more or do a better job communicating my intentions in the abstract (see Adam’s awesome post). As a data professional, I have way too much respect for numbers to see them bandied about in this manner. It bugs me. (Can you tell?)


    The last comment is a light-hearted jab. You will have to listen to the session recording to get the inside joke.

    Hacking the SSIS Catalog

    • Overall Session: 2.90
    • Session Content: 2.87
    • Presentation: 3.00
    • Promotions: 1.14

    A sample of comments:

    • I did pick up a couple interesting things, but I left the session feeling frustrated.  This felt more like a 200 level session instead of 400 level that I was expecting.
    • My only issue was that the session built on a previous (biml) session that Andy had done previously. It wasn't needed to understand the content but it would have been nice to list that as a helpful pre-requisite, since he used content from that previous… (comment was truncated)
    • Andy had a couple of issues with his demo, but handled it like the professional that he is.  He moved on and continued presenting the next point in his presentation.  He even poked fun at himself which I feel was a good add for the attendees.
    • I barely understood most of what this was explaining, but it was one of my favorite sessions!  The style and pacing was just right to keep me caught up on a subject I know very little about.
    • I thoroughly enjoyed the presentation and especially the speaker. Easily gets my vote for the best speaker of the week!  That was fun learning. Thank you so much!
    • Andy seems like a "mad genius"! His presentation was my favorite of the whole conference. The material was truly one-of-a-kind and extremely useful. Andy's delivery of the material was both confident and hilarious. I won't miss the chance to attend another
    • Andy does what he wants and did an outstanding job on how simply add some custom tables, procedures, and views to easily look and manage dw load

    My thoughts:

    The comment on session level is actionable. (I’ve beaten that horse enough for one blog post.) The comment about connecting material from the two sessions is very fair. The truth is, I made the decision to connect them after delivering the first session. As the commenter points out, it didn’t matter which SSIS packages I used to demonstrate customizing SSIS package execution within the SSIS Catalog. But it is fair to point out that I didn’t disclose, before this presentation began, that I would be reusing SSIS packages from the other session’s demos.

    I had issues with a demo because I built something on stage, in front of everyone, without rehearsing it first. My friends who are fantastic presenters tell me I should not do this. I respectfully disagree. I am sharing about software development (Biml and SSIS are software development). I will continue to build ad hoc demos in front of people during presentations – and fail – because I want everyone in the room to understand that failure is a normal (and good) part of software development.

    Again, the last comment contains an inside joke from the session that will make more sense if you listen to the recording.


    It is always an honor to present. I sincerely appreciate good feedback because it helps me improve. Thanks to everyone who attended my sessions and to everyone who will listen to the recordings when they become available!


  • Do You Need to Update Your PASS Profile or SQL Saturday Speaker Profile?

    I recently needed to update my PASS and SQL Saturday speaker profiles. I emailed my good friend and Awesome Community Person Karla Landrum for help, and Karla kindly responded with detailed instructions that even I could follow. I decided to capture and share my experience updating my PASS and SQL Saturday profiles.

    Updating Your PASS Profile 

    Log in to the PASS website.


    Click on your name in the upper right corner of the PASS web site.


    Click the “myProfile” link on the left side of the page:


    Edit your profile!


    Updating Your SQL Saturday Speaker Profile

    You may update your SQL Saturday Speaker Profile by logging into the SQL Saturday website and clicking any SQL Saturday event:


    Click the Speaker link at the top of the page:


    Click the Speaker Profile link:


    Edit your speaker profile!



    If you need to update your PASS profile or SQL Saturday speaker profile, I hope this information helps.


  • Learn to Code SSIS: The Execute SQL Task

    Note: This article was first published at SQL Authority.

    With “SQL Server” included in the name of SQL Server Integration Services (SSIS), it is easy for people to assume SSIS is simply a database tool or accessory. Let there be no mistake: SSIS is a software development platform. To give you a better understanding of how to use SSIS as a development platform, I’ll write a series of blog posts that step through how to use Control Flow tasks. This first article covers the Execute SQL Task.

    When developing solutions with SSIS, I use a handful of Control Flow tasks:

    • Execute SQL Task
    • Data Flow Task
    • Script Task
    • Execute Package Task
    • File System Task
    • Execute Process Task

    This list is ordered by frequency of use – from most-used to least-used. In this article I provide a basic example of configuring the SSIS Execute SQL Task:


    Three things are required to configure an Execute SQL Task:

    1. Connection Type
    2. Connection Manager
    3. SQL Statement

    Connection Type

    The default Connection Type is OLE DB:


    I configure Execute SQL Tasks to execute SQL statements like:

    • truncate a table
    • update or insert a single row of data
    • call a stored procedure

    I typically use OLE DB or ADO.NET connection types, but I occasionally use ODBC connection types. When executing parameterized statements, I find ADO.NET offers a cleaner interface.
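    To illustrate the difference, here is a hedged sketch of the kind of parameterized statements I wire into Execute SQL Tasks. The `dbo.LoadLog` table and its columns are hypothetical, invented for this example; the placeholder syntax for each connection type is the part that matters:

    ```sql
    -- With an OLE DB connection manager, parameters are "?" placeholders,
    -- mapped by ordinal position (parameter names 0, 1, ...):
    UPDATE dbo.LoadLog
    SET LoadEndDateTime = ?
    WHERE LoadLogID = ?;

    -- With an ADO.NET connection manager, parameters are named,
    -- mapped by name on the task's Parameter Mapping page:
    UPDATE dbo.LoadLog
    SET LoadEndDateTime = @LoadEndDateTime
    WHERE LoadLogID = @LoadLogID;
    ```

    The named parameters are why I find ADO.NET cleaner: the mapping is self-documenting, where the OLE DB ordinals are easy to transpose.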

    After selecting a connection type, selecting the Connection Manager is next. The Execute SQL Task Editor filters the list of available connection managers, displaying only connection managers of the (connection) type configured in the previous step:


    The last required step in configuring an SSIS Execute SQL Task is to supply an SQL statement. There are two properties for this: SQLSourceType and SQLStatement. The SQLSourceType property specifies the source of the SQL statement as one of the following:

    • Direct Input
    • File Connection
    • Variable

    The SQL statement can be entered manually (Direct Input), stored in a file (File Connection), or stored in an SSIS variable (Variable). In most cases, you will enter the query manually:


    Right-click the task and click “Execute Task” to test the task configuration. Or simply press the F5 key to execute the entire SSIS package:


    You now know the basics of configuring an SSIS Execute SQL Task. Go code!



    Learn more:

    Advanced SSIS Training with Andy Leonard & Tim Mitchell in Reston Virginia Dec 7 - 10, 2015
    IESSIS1: Immersion Event on Learning SQL Server Integration Services
    IESSIS2: Immersion Event on Advanced SQL Server Integration Services

    Stairway to Integration Services

    SSIS Design Patterns

    Andy Leonard Consulting – I am here to help.

  • Biml 101 Recording and Files Are Now Available!

    I had an awesome time sharing about Biml (Business Intelligence Markup Language) earlier today! Thank you to everyone who registered and turned out for the live presentation.

    For those unable to make it to the live presentation, the recording is now available for viewing.

    I’ve posted the Mist and SSIS solutions source code. You can get them here:

    Right after I finished the webinar, I realized I did not demonstrate importing a package using BimlOnline. Bad me! So I created a short video (~1 minute) to demonstrate. You can view it here.




    Learn more:

    Biml 102: Using Biml as an SSIS Design Patterns Engine
    Advanced SSIS Training with Andy Leonard & Tim Mitchell in Reston Virginia Dec 7 - 10, 2015
    IESSIS1: Immersion Event on Learning SQL Server Integration Services
    IESSIS2: Immersion Event on Advanced SQL Server Integration Services

    Stairway to Integration Services
    Stairway to Biml

    Join Andy's Biml mailing list!

    SSIS Design Patterns

    Andy Leonard Consulting – I am here to help.

  • Managing SSIS Catalog Project Parameter Values

    One of the coolest features of the SSIS Catalog is Catalog Environments. Catalog Environments offer a compelling and rich solution for externalizing and managing Catalog Project Parameter values, whether those parameters represent connections or ordinary values. Configuring and using Catalog Environments involves four steps:

    1. Create the Catalog Environment
    2. Create and Configure the Catalog Environment Variables
    3. Reference the Catalog Environment from a Catalog Project
    4. Map Catalog Environment Variables to Catalog Project Connections and Parameters
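    Each step is walked through in SSMS below, but the same four steps can also be scripted. Here is a hedged T-SQL sketch using the public SSISDB catalog stored procedures; the folder, project, environment, variable, and parameter names are hypothetical, so substitute your own:

    ```sql
    DECLARE @reference_id BIGINT;

    -- 1. Create the Catalog Environment
    EXEC SSISDB.catalog.create_environment
         @folder_name = N'MyFolder',
         @environment_name = N'MyEnvironment';

    -- 2. Create and configure a Catalog Environment Variable
    EXEC SSISDB.catalog.create_environment_variable
         @folder_name = N'MyFolder',
         @environment_name = N'MyEnvironment',
         @variable_name = N'SourceConnectionString',
         @data_type = N'String',
         @sensitive = 0,
         @value = N'Data Source=MyServer;Initial Catalog=MyDb;Integrated Security=SSPI;';

    -- 3. Reference the Catalog Environment from a Catalog Project
    --    (@reference_type 'R' = relative: the Environment resides in the
    --     same Catalog Folder as the project)
    EXEC SSISDB.catalog.create_environment_reference
         @folder_name = N'MyFolder',
         @project_name = N'MyProject',
         @environment_name = N'MyEnvironment',
         @reference_type = 'R',
         @reference_id = @reference_id OUTPUT;

    -- 4. Map the Environment Variable to a Catalog Project Parameter
    --    (@value_type 'R' = referenced; @parameter_value names the variable)
    EXEC SSISDB.catalog.set_object_parameter_value
         @object_type = 20,   -- 20 = project-scoped parameter
         @folder_name = N'MyFolder',
         @project_name = N'MyProject',
         @parameter_name = N'SourceConnectionString',
         @parameter_value = N'SourceConnectionString',
         @value_type = 'R';
    ```

    Scripting the configuration this way makes it repeatable across Dev, Test, and Production Catalogs, which is the whole point of externalizing the values.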

    Create the Catalog Environment

    Create an SSIS Catalog Environment by connecting to an SSIS Catalog using SQL Server Management Studio (SSMS) Object Explorer. Expand the Integration Services Catalogs node, the SSISDB node, and the desired folder node. Right-click the Environments virtual folder and click “Create Environment…”:


    Give the environment a name and (optional) description:


    Click OK to create the Catalog Environment.

    Create and Configure the Catalog Environment Variables

    Return to SSMS Object Explorer and double-click the Catalog Environment to open it for editing. Add and configure Catalog Environment Variables on the Variables page:


    Click OK to save the Catalog Environment configuration changes.

    Reference the Catalog Environment from a Catalog Project

    In SSMS Object Explorer, right-click a Catalog Project and click “Configure…” to create a reference between the Catalog Project and Catalog Environment:


    When the Configure window opens, add a reference to a Catalog Environment on the References page. Catalog Environments are scoped at the SSIS Catalog Folder level. You can reference Catalog Environments in the same Catalog Folder as the Catalog Project or in a different Catalog Folder:


    Once referenced, the Catalog Environment will appear in the Reference grid on the Catalog Project Configure References page. The “.” in the Environment Folder column indicates the Catalog Environment resides in the same Catalog Folder as the Catalog Project:


    Map Catalog Environment Variables to Catalog Project Connections and Parameters

    On the Parameters page of the Configure window, you can map Catalog Environment Variables to values by clicking the ellipsis on the Parameter grid row that corresponds to the parameter you wish to configure:


    Clicking the ellipsis opens the Set Parameter Value dialog. Click the “Use environment variable” option and select the Catalog Environment Variable from the corresponding dropdown:


    The dropdown list is filtered to only display Catalog Environment Variables of a data type compatible with the Catalog Project Parameter value.

    Successful mapping is indicated by the underlined name of the Catalog Environment Variable in the Parameter Value column of the Parameters grid:


    Catalog Project Parameter Manual Overrides

    You can also manually override the value of a Catalog Project Parameter by clicking the ellipsis and selecting the “Edit value” option:


    When a Catalog Project Parameter value is manually overridden using the Edit value function, it displays in bold in the Parameter Value column of the Parameters grid. Catalog Project Parameter values that are neither mapped to a Catalog Environment Variable nor manually overridden execute with their design-time default values, and their text in the Parameter Value column is not decorated:



    SSIS Catalog Environments provide a rich and robust solution for parameter externalization.
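    If you want to verify the mappings without clicking through each parameter, a hedged sketch of a query against the public SSISDB catalog views follows. The project name `MyProject` is hypothetical; a value_type of 'R' indicates the parameter is referenced to a Catalog Environment Variable, while 'V' indicates a literal value:

    ```sql
    SELECT p.name AS project_name,
           op.object_name,
           op.parameter_name,
           op.value_type,                -- 'R' = referenced, 'V' = literal
           op.referenced_variable_name,  -- populated when value_type = 'R'
           op.default_value
    FROM SSISDB.catalog.object_parameters AS op
    INNER JOIN SSISDB.catalog.projects AS p
            ON p.project_id = op.project_id
    WHERE p.name = N'MyProject';
    ```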

