The SQL Server Blog Spot on the Web


Andy Leonard

Andy Leonard is CSO of Linchpin People and SQLPeople, an SSIS Trainer, Consultant, and developer; a Business Intelligence Markup Language (Biml) developer; SQL Server database and data warehouse developer, community mentor, engineer, and farmer. He is a co-author of SQL Server Integration Services Design Patterns and Managing Geeks - A Journey of Leading by Doing, and author of the Stairway to Integration Services.

  • Seeing Sharply

    My technology career began as a curiosity in 1975. I was 11 years old and living next door to an elderly couple. Their son returned from an engineering career in the Air Force and began tinkering with a microprocessor kit. I was curious, and he taught me Motorola 6800 machine language. After that, 40 years ago at the time of this writing, he taught me BASIC.

    I stuck with BASIC through GWBasic and other versions. When Visual Basic (VB) was released, I dove into the language, relishing the ability to develop Windows programs in a familiar syntax. About the time .Net came out I made the leap to database technology. Lots of my VB friends made the leap to C#. I tried – even writing the demos for my second book project (Professional Software Testing with Visual Studio 2005 Team System) in C#. But for some reason, my brain refused to grasp C#.

    And then I began working with Kevin Hazzard.

    Kevin is inspirational. And cool. And funny. And intelligent. And cool. And mostly patient, which benefited me greatly. When he found himself dropped into the middle of an SSIS project and needing a little guidance, I took the opportunity to swap him C# tutoring for SSIS tutoring. It worked out well, and I have been developing with C# for several months now. I have a long way to go before I will venture to call myself a C# developer (and it may never happen). But I am comfortable with the language, and that’s something I could not say (or write) a year ago.

    Thanks, Kevin, for helping me take another step on my DevJourney.


  • SQL Server 2016 Temporal Tables and Type II Data Warehouse Dimensions

    SQL Server 2016 introduces temporal tables, an interesting and “data warehouse-y” addition to the relational database.

    So how do temporal tables work? They add a couple of columns to a table that track when a row, identified by its primary key, was inserted or last updated. Suppose you have a row that looks like this:


    If you begin tracking this data temporally, the row could appear similar to this:


    Note the dateTime2 data type used for the date-range fields. A ToDate value of 9999-12-31 23:59:59.9 indicates this row is “current.” The value in the FromDate field indicates when the row was inserted or last updated, whichever happened last.

    How did we get here? Let’s build out a demo. But first, a shout-out to Damian Widera, whose SQLBlog post titled “Temporal table behind the scenes - part 1” raised my interest in temporal tables.


    You can begin by creating a database. I created TemporalTests using the following statement:

    If Not Exists(Select name
              From sys.databases
              Where name = 'TemporalTests')
           Create Database TemporalTests

    Next, I create a table named TestTable in TemporalTests:

    Use TemporalTests

    If Not Exists(Select name
                  From sys.tables
                  Where name = 'TestTable')
      Create Table dbo.TestTable
      (
       Id Int Identity(1,1)
        Constraint PK_TestTable Primary Key,
       Name VarChar(20),
       Value Int,
       FromDate DateTime2(1) Generated Always As Row Start Not Null,
       ToDate DateTime2(1) Generated Always As Row End Not Null,
       Period For System_Time (FromDate, ToDate)
      ) With (System_Versioning = ON)

    I won’t go into all the details of the syntax, but I will point out three requirements:

    1. The table needs a primary key. This serves to identify the row and, I believe, should not be updated. This makes the primary key function a lot like a surrogate key in a data warehouse dimension.
    2. The table needs two DateTime2 fields configured similarly to those shown (FromDate and ToDate), and a clause identifying these fields as the temporal value columns: Period For System_Time (FromDate, ToDate).
    3. The table needs the key phrase With (System_Versioning = ON) at the end of the Create Table statement (when creating a temporal table).

    When created, the table in Object Browser appears as shown:


    Note you can supply a name for the temporal system table, but if you don’t supply a name SQL Server will create one for you.
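    For illustration, here is a sketch of a similar Create Table statement with the history table named explicitly. TestTable2History is a name I made up for this example; verify the System_Versioning syntax against your SQL Server 2016 build:

    ```sql
    -- Sketch: name the temporal history table yourself instead of
    -- letting SQL Server generate a name. (TestTable2History is illustrative.)
    Create Table dbo.TestTable2
    (
     Id Int Identity(1,1)
      Constraint PK_TestTable2 Primary Key,
     Name VarChar(20),
     Value Int,
     FromDate DateTime2(1) Generated Always As Row Start Not Null,
     ToDate DateTime2(1) Generated Always As Row End Not Null,
     Period For System_Time (FromDate, ToDate)
    ) With (System_Versioning = ON (History_Table = dbo.TestTable2History))
    ```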

    Let’s load some data!

    Insert Into dbo.TestTable
       (Name, Value)
    Values
       ('A', 0)
     , ('B', 1)
     , ('C', 2)
     , ('D', 3)
     , ('E', 4)

    If we query the table, we’ll see all the rows have the same FromDate and ToDate:


    Now, if we update a few rows:

    Update dbo.TestTable
    Set Value = Value * 10
    Where Value > 2

    The table now has some rows – the last two – with different FromDate values:


    We can now exercise a Type-II-like dimension query to examine the state of the rows at some time before the latest update:

    Select *
    From dbo.TestTable
    For SYSTEM_TIME As Of '2015-09-18 22:13:00.0'

    The results? The values as they were at the time specified in the For SYSTEM_TIME As Of '2015-09-18 22:13:00.0' clause:


    Cool? I thought so too.
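    As Of is only one of the temporal sub-clauses. If you want every version of every row, rather than a point-in-time snapshot, a sketch like this should work (the date values are illustrative):

    ```sql
    -- Every version of every row, current and historical:
    Select *
    From dbo.TestTable
    For System_Time All
    Order By Id, FromDate

    -- Row versions that were active at any point in a range:
    Select *
    From dbo.TestTable
    For System_Time Between '2015-09-18 00:00:00.0' And '2015-09-19 00:00:00.0'
    ```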

    I still have unanswered questions about temporal tables for Type-II dimension tracking. The largest question regards performance at scale. We’ll see once we get out of the SQL Server 2016 CTPs…


  • Presenting at SQL Saturday 431–Spartanburg 26 Sep!

    I am honored to present two sessions at SQL Saturday #431 in Spartanburg, SC on 26 Sep 2015! I will be delivering:

    SQL Saturday #431 also features two stellar precons: Transforming and Visualizing Data with Power BI by Patrick LeBlanc and Getting the New DBA Up to Speed with Performance Tuning by Ed Watson and Mike Lawell.

    There are lots of awesome speakers at SQL Saturday #431 – check out the schedule for more information.

    Register and, if you read this blog, introduce yourself!


  • Opinions Change

    Dewey did *NOT* defeat Truman!

    Opinions change, and they should change as more and better evidence surfaces. There are words for this: “learning” and “growing” are but two of them.

    Communication plays a key role in changing opinions. I’ve been studying communication recently and I’ve decided it’s a miracle anyone is ever able to communicate with anyone else. There are several reasons; one reason is we’re wrong more than we are right.

    On Being Wrong

    In her TED talk, Kathryn Schulz, "Wrongologist," states “We misunderstand signs around us.” "I can't actually think about anything I'm wrong about. At least not in the present tense." "So effectively we all wind up… traveling through life… trapped in this little bubble of feeling very right about everything." "How does it feel - emotionally - how does it feel to be wrong? How does it feel to realize you're wrong?"

    Her main point is: "Just being wrong doesn't feel like anything."

    Kathryn goes on to explain: "The way to succeed in life is to never make any mistakes." "...getting something wrong means there's something wrong with us."

    In response:

    1. We assume people are ignorant.
    2. We assume people are idiots.
    3. We assume people are evil.

    "This attachment to our own rightness keeps us from preventing mistakes when we absolutely need to, and causes us to treat each other terribly." "The miracle of your mind isn't that you can see the world as it is. It's that you can see the world as it isn't."

    Her (Brilliant, in my opinion) Conclusion:

    "It does feel like something to be wrong. It feels like being right."

    I encourage you to take a few minutes to listen to the TED talk. It’s awesome, as is Kathryn’s book:



    Learn more:


  • While You Have It…

    I remind my children who live at home - the "junior engineers" - to deal with the things they have in their hands while they have them in their hands. Why? Because it’s inefficient to put something down only to pick it up later and put it where it belongs. Like all rules, this one has exceptions.

    I also tell them to spend an extra 30 seconds each evening putting their shoes in the same place. When they forget to do this, they spend 10 minutes the next morning searching for their shoes. A little planning goes a long way.

    Deal with stuff while you have your hands on it, while it’s in front of you. And get it done.


    Learn more:


  • Presenting SSIS 2014 Data Flow Tuning Tips and Tricks at Triad PASS BI 25 Aug 2015!

    I am honored to present SSIS 2014 Data Flow Tuning Tips and Tricks to the Triad PASS BI chapter in Winston-Salem 25 Aug 2015.

    Do you want SSIS to go fast? This session is for you! Attend and learn techniques for developing, instrumenting, monitoring, and managing SSIS 2014 Data Flow performance in your data integration enterprise.
    Session Level: Beginner-Intermediate

    Register today!


  • SSIS Catalog Logging and Reporting in the Enterprise

    The SSIS Catalog, first introduced with the release of SQL Server 2012 Integration Services (SSIS 2012), is a big step forward in Data Integration Lifecycle Management (DILM) with SQL Server Integration Services (SSIS). Like all software, the SSIS Catalog performs some tasks admirably and other tasks… not so admirably. In this post, I will share some thoughts on SSIS Catalog logging.

    The Good


    My favorite SSIS Catalog feature is the integrated logging. It Just Works. I configure a default Logging Level when I create an SSIS Catalog (or later) and I don’t have to think about logging unless or until I want to change the Logging Level for some reason. And I can make that change on a per-package-execution basis, even. SSIS Catalog logging is fast and requires little or no work on my part to maintain. Other logging defaults configured at the time of the Catalog creation (or later) include the Retention Period and Clean Logs Periodically properties:


    I can run my data integration enterprise with these defaults. And I sometimes do.
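    If you prefer a query to a screenshot, the same defaults can be read from the catalog.catalog_properties view in SSISDB. A sketch (the property names here are from memory; verify them against your Catalog version):

    ```sql
    Use SSISDB

    Select property_name, property_value
    From catalog.catalog_properties
    Where property_name In ( 'SERVER_LOGGING_LEVEL'
                           , 'RETENTION_WINDOW'
                           , 'OPERATION_CLEANUP_ENABLED' )
    ```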

    The OK

    Time-Deterministic Logging

    The SSIS Catalog will log some events in an order that is different from the order in which the events occurred in real time. In my experience, it’s possible to see events that occur within a few milliseconds logged out of order. I’ve explained this to a few auditors and, to date, none have disqualified a system because of this behavior.

    The Bad

    Log Maintenance

    The way the SSIS (in 2012 and 2014, at least) Catalog cleans these records can be troublesome. I won’t bore you with the technical details. I will just tell you that if you have lots of execution and event metadata stored in the SSIS Catalog log tables, the log cleaning process can take a long time and consume lots of server resources.


    This next issue is potentially Ugly, but it is at least Bad. In order for someone to view the SSIS Catalog reports that ship with the solution (in SSIS 2012 and 2014), the following must be true:

    • The user must have SQL Server Management Studio (SSMS) installed.
    • The user must be a member of the ssis_admin or sysadmin roles.

    The reports are excellent:


    I have no issue with Database Administrators (DBAs), Database Developers, or Data Integration Developers using SSMS and the SSIS Catalog reports. I worry some about Operations people and project stakeholders needing SSMS and elevated permissions just so they can access operational or historical enterprise data integration information. Am I saying I do not want Operations people and project stakeholders to see the data contained in the SSIS Catalog reports? No, I most certainly am not. My concern is that someone will accidentally delete a mission-critical SSIS package from a Production SSIS Catalog. (Note: DBAs, Database Developers, and Data Integration Developers are just as at risk of accidentally deleting something as Operations people or project stakeholders.)

    You see, if you are a member of either the ssis_admin or sysadmin role, you have permission to delete SSIS packages. You also have permission to do a whole lot of other things that, frankly, you shouldn’t need to do.

    The Ugly

    The defaults for SSIS Catalog logging properties are shown in the screenshot above. The default logging level is Basic. I like that setting. I also like cleaning the logs periodically.

    I cringe a little at the default retention period. Why? Experience. I’ve been an enterprise data integration architect for years. Doing that job will teach you to think about stuff normal people don’t consider. Here’s why I cringe at a 365-day default retention period: What happens 1 Jan? Most enterprises run mission-critical, regulation-required, end-of-year reports on or around 1 Jan. These processes may only execute once per year (but they’re tested each time a change is deployed, right? Don’t get me started…). They can be big and complex processes and they usually execute across a larger domain of data than nightly, weekly, monthly, or quarterly processing.

    What does that mean for the SSIS Catalog logs? More messages!

    Wait, it gets worse.

    Where are your support people on or around 1 Jan? Sitting at their desks watching these processes run to completion? Gosh, I hope not. Let’s say you manage to make it through that first SSIS-Catalog-managed year-end processing cycle without calling in too many people. Good. Fast-forward one year – or 365 days, as the Logging Retention Period property of the SSIS Catalog likes to call it. The default logging retention period also kicks in on or around 1 Jan (you think you have a 25% chance of hitting a leap year, but Murphy is way better at statistics than you or me…), right about the time your extra long data integration processing is trying to run. That smell coming from your server? That’s smoke.



    I can hear you thinking, “Great. Now what, Andy?” I’m glad you asked! Ask yourself, “Self, do we really need a year of historical enterprise data integration logging?”

    If the answer is “Yes, we need at least a year of log data,”  then I recommend you increase the Retention Period (days) property to some number greater than 365 – maybe 380, provided fifteen days will get your enterprise beyond the year-end process. The goal is to have the log maintenance from last year’s year-end processing occur sometime after this year’s year-end processing.

    If the answer is “No, we don’t need a year of log data,” I recommend you set the Retention Period (days) property to the number of days you need to retain.
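    Either way, you can script the change with the catalog.configure_catalog stored procedure in SSISDB rather than clicking through SSMS. A sketch, using the illustrative 380-day value from above:

    ```sql
    Use SSISDB

    Exec catalog.configure_catalog
         @property_name = N'RETENTION_WINDOW'
       , @property_value = 380
    ```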


    If you’re reading this and you have an SSIS Catalog in your enterprise, then you also have SQL Server Reporting Services (SSRS). You can write your own reports to execute against the logs and data integration metadata stored in the SSISDB database. SSRS reports can be deployed to run in a web browser. That makes them read-only from a user perspective.

    To be fair, the SSIS Catalog reports built into SSMS are also read-only. But SSMS is anything but read-only. Good DBAs will read this and (rightfully) think, “Well, enterprise security should define who has permissions to read and write and make changes to data in SSMS.” I concur, and that’s almost correct. The SSIS Catalog node in the SSMS Object Browser is different. Permissions checks are actually hard-coded into some of the SSIS Catalog Views. In order to view the reports, users must be members of either the ssis_admin or sysadmin role.

    To also be fair, the user still must be a member of either ssis_admin or sysadmin to view data via SSIS Catalog Views from an SSRS solution deployed so that it runs in a web browser. Two thoughts:

    1. It’s a web browser and the reports are (hopefully) only executing SELECT statements.
    2. If you design your own reporting solution in SSRS, you can define how you access the data. You do not need to use the SSIS Catalog-supplied views with their hard-coded role checks, you can write your own views or stored procedures.
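    As a sketch of the second thought: a report query can read from the internal schema directly, bypassing the role checks baked into the catalog views. The table and column names below are from memory; verify them against your SSISDB version (and remember Microsoft does not document the internal schema as a supported interface):

    ```sql
    -- Recent package executions, newest first (illustrative column list):
    Select ex.execution_id
         , ex.folder_name
         , ex.project_name
         , ex.package_name
         , op.status
         , op.start_time
         , op.end_time
    From SSISDB.internal.executions ex
    Join SSISDB.internal.operations op
      On op.operation_id = ex.execution_id
    Order By ex.execution_id Desc
    ```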

    I and others have built custom SSRS reporting solutions that read the same SSIS Catalog execution logs and metadata. At Linchpin People, we are here to help™.


    The SSIS Catalog contains a great set of Data Integration Lifecycle Management tools, reports, and functionality. The logging and reporting is very good, but there are things of which you should be aware.


    I am here to help:
    Contact Andy!

    Learn more:
    SSIS Design Patterns training in London, UK, 7-10 Sep 2015    
    Advanced SSIS Training with Andy Leonard & Tim Mitchell in Reston Virginia Dec 7 - 10, 2015

    DevOps and Data Integration with SSIS (at IT/Dev Connections in Las Vegas, NV 15 Sep 2015)
    Designing an SSIS Framework (at IT/Dev Connections in Las Vegas, NV 16 Sep 2015)
    Using Biml to Automate SSIS Design Patterns (at the PASS Summit 2015 in Seattle, WA 27-30 Oct 2015)
    Hacking the SSIS Catalog (at the PASS Summit 2015 in Seattle, WA 27-30 Oct 2015)

    Stairway to Integration Services
    Linchpin People Blog: SSIS
    Stairway to Biml

    SSIS Design Patterns

  • SSIS Design Pattern: Access Parent Variables from a Child Package in the SSIS Catalog

    I am regularly asked about passing variable values between SSIS packages.

    Tim Mitchell (Blog | @Tim_Mitchell) wrote an excellent Notes from the Field article for the SQL Authority blog called SSIS Parameters in Parent-Child ETL Architectures. Before the first version of the SSIS Catalog was even released, Jamie Thomson blogged about The new Execute Package Task in SSIS in Denali and Eric Johnson blogged about Calling Child Packages in SSIS here at SQLBlog. In pre-SSIS Catalog days, I blogged about this topic at SQLBlog: Accessing Parent Package Variables In Child Packages... Without Configurations. There are many other posts related to accessing variable values in SSIS.

    This topic has received a fair amount of treatment.

    “Why are you writing about this again, Andy?”

    I’m glad you asked! I’ve learned something new about the behavior I described in my 2007 post. I’ve discovered a difference in the way SSIS parent-child package variable values interact in the SSIS Catalog and in SSDT-BI – the designer.

    Disclaimer: To be fair, the behavior I described in 2007 was not a design feature of SSIS. It worked, but it was (and is) a by-product of the way the original Execute Package Task was designed. I was given fair warning by the SSIS team at Microsoft that there were no guarantees this functionality would exist in future versions of the product.

    Another disclaimer: I am not picking on Microsoft! My intent in writing this is to inform those who read my 2007 post that this functionality still exists but is reported differently than some might expect, and to give you some sense of how things are different. Cool? K. Let’s move forward.

    Show and Tell

    Let’s build a demo SSIS project. I called mine ParentChildTests and added two SSIS packages named EPTParent.dtsx and Pkg1.dtsx:


    In the EPTParent.dtsx SSIS package, I add a package-scoped string variable named ParentVariable (creative name, I know!) with some default value:


    In the Pkg1.dtsx SSIS package, I add a Script Task configured to read the ReadOnlyVariables User::ParentVariable and System::PackageName. The Script Task contains the following code:

                string packageName = Dts.Variables["System::PackageName"].Value.ToString();
                string msg = "I am " + packageName;
                bool fireAgain = true;

                Dts.Events.FireInformation(1001, "Script", msg, "", 0, ref fireAgain);

                string parentVariable = Dts.Variables["User::ParentVariable"].Value.ToString();
                msg = "Parent Variable: " + parentVariable;

                Dts.Events.FireInformation(1001, "Script", msg, "", 0, ref fireAgain);

    The purpose of the Script Task is to generate a couple log messages by raising Information events. Please note: there is no variable in Pkg1.dtsx named ParentVariable! If you create a variable named ParentVariable in Pkg1.dtsx, this demo will not work. (You can validate the script before closing the VSTA window. I show you how here!)

    You can test-execute Pkg1.dtsx to see if it fails (it should fail):


    “Wait, what, Andy? You’re teaching us how to write SSIS packages that fail?” Yes. Yes, I am. On purpose, even. Why? Because you are going to write SSIS packages that fail. Teaching you anything else would be disingenuous. If failure bothers you, you may want to consider a different line of work.

    The key here is the package should fail and return the error:

    Failed to lock variable “User::ParentVariable” for read access…

    If you see this error, you’ve done everything I described correctly (so far).

    This error means there is a caveat: You cannot successfully execute this package stand-alone; to succeed, this SSIS package must be executed via the parent SSIS package (EPTParent.dtsx in this case).

    In the EPTParent.dtsx SSIS package, I add an Execute Package Task configured to execute Pkg1.dtsx:


    Before we test-execute EPTParent.dtsx in SQL Server Data Tools – Business Intelligence (SSDT-BI), I want to explain what I expect to happen. I expect EPTParent.dtsx to execute. I expect the Execute Package Task in EPTParent.dtsx to execute and, in turn, execute the Pkg1.dtsx SSIS package. I expect Pkg1.dtsx to execute – successfully this time – leveraging an undocumented (and unsupported) feature in the SSIS Execute Package Task that allows child packages to access parent package variables.

    Now you may be thinking, “Big deal, Andy. I can just use the Parent Package Variable Package Configuration to read the value of a parent package variable in a child package.” And you would be correct in your thinking. But I was very careful about the words I wrote just now, and I wrote “access.” Why didn’t I just write “read” since that’s all we’re doing here? Because it’s possible, using this very functionality, to also write a value from a child package to a parent package. That’s not possible using a Parent Package Variable Package Configuration.

    Taking a look at the Progress / Execution Results tab of Pkg1.dtsx, we see exactly what I thought I would see – a successful execution and an information message reporting the value of ParentVariable:


    “Why Are You Telling Me This?”

    Again, I’m glad you asked. So far, all I’ve showed you is what I wrote about in 2007.

    Here’s what’s changed. If I deploy this project to the SSIS Catalog, execute it, and view the Catalog reports, I will not see two package executions. Instead I will see a single package execution for EPTParent.dtsx:


    The Execution Overview table on the Overview report provides some clarity – especially the Task Name column:


    Clicking the View Messages link shows even more detail. The last few messages (sorted last-occurred, first-listed) demonstrate that Pkg1’s Script Task executed “inside” Pkg1, and provide some evidence that Pkg1 executed “inside” the Execute Package Task in the EPTParent.dtsx SSIS package:



    Is this merely trivial or anecdotal? As with so many other questions in the data field, the answer is “it depends.”

    If you need (or merely desire) visibility into your enterprise data integration lifecycle, my answer would be “no, this is not trivial or anecdotal.” Child package executions – when the child is executed via the Execute Package Task – are hidden from the “All Executions” report view. An SSIS package executed and it doesn’t show up. If you understand why it doesn’t show up, this is no big deal. But the report is not titled “Most Executions,” is it?

    “Why does this happen?”

    I go into more detail about this behavior in my article Advanced Event Behavior – Level 10 of the Stairway to Integration Services – part of the Stairway to SSIS series at SQL Server Central. That article includes this image (Figure 43), my attempt at constructing a visual representation that SSIS will never produce in real life. A Parent package – named Parent.dtsx – executes. It contains an Execute Package Task that executes a package named Precedence.dtsx. To construct this image, I used two screenshots of the Package Explorer tab – one from Parent.dtsx and one from Precedence.dtsx:


    This is one representation of what happens when a child SSIS package is executed from a parent SSIS package. It shows the child SSIS package running “in scope” of the Execute Package Task, and I believe that is an accurate representation. And I think that’s why we do not see the child package executions listed on the All Executions SSIS Catalog report.


    Learn more:
    SSIS Design Patterns training in London, UK, 7-10 Sep 2015    
    Advanced SSIS Training with Andy Leonard & Tim Mitchell in Reston Virginia Dec 7 - 10, 2015

    Stairway to Integration Services
    Linchpin People Blog: SSIS
    Stairway to Biml

    SSIS Design Patterns

  • What Customers Want

    Not once, in my combined years of direct employment by enterprises or consulting, has a customer contacted me about the grain of their date dimension in their enterprise data warehouse. Not one single time. I haven’t been contacted about cardinality either. Or trouble with execution threading models. Nope. Not ever.

    Instead, I’ve been contacted about other stuff. Stuff like the accuracy and “drill-ability” of a report and how long it takes to load data and how up-to-date the data is. Those are the concerns customers tell me they want addressed.

    Often the solution to the problem they are trying to solve involves data warehouse grain or cardinality or threading.

    Are my customers lying to me? Absolutely not! Part of my job is to comprehend what my customers perceive, to find out what they believe is the problem they’re trying to solve. “This report takes too long to display,” is a valid description of a problem that needs solving. You don’t need to know what a tuple is to help. You don’t need to understand how indexes traverse b-trees to get what they’re telling you.  You don’t even need a degree in computer science to understand what they’re saying: The report is slow.

    All you need to do is listen.

    What customers want is your help.


    Learn more:


  • Validate SSIS Scripts Before Closing the VSTA Window

    I love learning new stuff!

    I’m blessed to work with a talented group of individuals at Linchpin People. We often work together in teams and I have to admit, I cherish those times. One reason is I learn something new. Every. single. time.

    For years, I’ve validated SSIS Script Task code by searching for red and blue squiggly lines before closing the VSTA window:


    With a small amount of scripting, the squiggle-search is a fine method for catching typos and similar errors. For larger amounts of code, though, this could take time and one could easily overlook a typo or squiggle.

    I learned a neat trick while pair programming with Kevin Hazzard (Metaprogramming in .Net | @kevinhazzard): I can execute a Build before leaving the VSTA window and test the viability of the code.

    From the VSTA window dropdown, click BUILD and then Build <VSTA Project Name>:


    If there are errors (and there are a couple errors in my string variable declaration above), they will appear in the Error window:


    If there are no errors or after I have corrected all errors, the build will succeed and I will see a note in the lower left corner of the VSTA window:



    Learn more:  
    SSIS Design Patterns training in London, UK, 7-10 Sep 2015
    Advanced SSIS Training with Andy Leonard & Tim Mitchell in Reston Virginia Dec 7 - 10, 2015

    Stairway to Integration Services
    Linchpin People Blog: SSIS
    Stairway to Biml

    SSIS Design Patterns

  • Own the SSIS Design Patterns E-Book for $10–Today Only!

    Own the SSIS Design Patterns (2nd edition - updated for SSIS 2014) ebook today for just $10!


    Learn more: 
    Stairway to Biml
    Linchpin People Blog: SSIS
    Stairway to Integration Services

    SSIS Design Patterns

  • Presenting Using Biml as an SSIS Design Patterns Engine at Hampton Roads .Net UG 11 Aug

    I am honored to present Using Biml as an SSIS Design Patterns Engine at the Hampton Roads .Net Users Group (@HRNUG) 11 Aug 2015!

    Business Intelligence Markup Language provides a powerful solution for creating and managing SSIS Design Patterns. Andy Leonard, one of the authors of SSIS Design Patterns, demonstrates the flexibility of Biml in this session.

    If you’ll be in the area, stop by and introduce yourself!


    Learn more: 
    Stairway to Biml
    Linchpin People Blog: SSIS
    Stairway to Integration Services

    SSIS Design Patterns

  • Interested in Training - From Zero To SSIS?

    I’ve received a couple inquiries recently about SSIS training for people new (or new-ish) to data integration. I have delivered a course called From Zero To SSIS in the past, but not recently.

    If you are interested in this training, please contact me by clicking the Email link in the upper right frame.

    If enough people express interest, I will be happy to schedule a delivery.


    Learn more: 
    Stairway to Biml
    Linchpin People Blog: SSIS
    Stairway to Integration Services

    SSIS Design Patterns

  • SQL Formatting Matters…

    … to some folks more than others.

    We likely all have stories about working with folks who care a lot about the format of their SQL statements. I’ve worked with a lot of database developers and analysts. I cannot say SQL formatting distinguishes the good from the bad (or ugly), but I have noticed the sharpest developers have a preferred SQL format. I’ve also noticed the very sharpest are the most dogmatic about the format they prefer.

    Why is this?

    I don’t know for sure. I’ve asked, and a few shared that their preferred SQL format helps them visualize or conceptualize the purpose of the statement. I can buy that. When working with folks who have a SQL format preference, I try to only send them SQL formatted as they like.

    How about you? Does format matter? If so, why?


    Learn more: 
    Stairway to Biml
    Linchpin People Blog: SSIS
    Stairway to Integration Services

    SSIS Design Patterns

  • Stairway to Integration Services Level 21 is Live!

    Extending Custom Execution in the SSIS Catalog – Level 21 of the Stairway to Integration Services is now live at SQL Server Central!



    Learn more: 
    Stairway to Biml
    Linchpin People Blog: SSIS
    Stairway to Integration Services

    SSIS Design Patterns
