THE SQL Server Blog Spot on the Web


Sergio Govoni

A place where I share my experience with SQL Server and related technologies

  • The free #eBooks series #SQLServer and .NET Blog

    The series of free eBooks "The SQL Server and .NET Blog eBook Series" by Artemakis Artemiou (@artemakis) collects dozens of articles explaining how to solve practical problems in these areas:
     
     
    Developing SQL Server (The SQL Server and .NET eBook Series)

    This book is for database developers and architects who develop SQL Server databases and related database processes. The book features tens of articles that provide detailed information on how to develop in SQL Server. However, the book is not intended to be a step-by-step comprehensive guide. Additionally, it assumes at least intermediate-level experience with SQL Server development and knowledge of basic database principles.

     
     
    Administering SQL Server (The SQL Server and .NET eBook Series)

    This book is for database administrators and architects who monitor and administer SQL Server instances in order to keep them operating to the highest possible level of stability and performance. The book suggests several techniques that can be used for ensuring a healthy SQL Server instance.

     
     
    Tuning SQL Server (The SQL Server and .NET eBook Series)

    This book is for database administrators and architects who monitor and tune SQL Server instances in order to keep them operating to the maximum possible performance and stability. The book suggests several techniques that can be used for ensuring a performant SQL Server instance.
     
     
     
These three eBooks should not be missing from your digital library!
  • SQL Saturday #454 – A couple of days after #sqlsat454

Slides and demo files from the sessions we presented at SQL Saturday #454 in Turin are available for download through the schedule page.

This is a good occasion to say "Thanks" to Politecnico di Torino (which hosted the event) and to the Organizers, Sponsors, and Speakers. A big thanks also to everyone who attended: you made it a great event!

We had a lot of fun :) and the photos posted on Twitter prove it!

  • PASS Summit is back! A look into the 2015 edition! #Summit15

The PASS Summit is back! This year, the most important event in the world for the SQL Server Data Platform will be held again in Seattle (WA) from 27 to 30 October 2015, preceded by two pre-conference days on 25 and 26 October 2015.

Why did I write "the most important event in the world for the SQL Server Data Platform"? Because the numbers of the PASS Summit 2014 speak for themselves! Last year over 5,000 data geeks attended the PASS Summit, which has become the landmark event for the entire Microsoft Data Platform. Take a look at the 2014 numbers shown in the following pictures; they are really impressive, aren't they?

Pictures – PASS Summit 2014 in numbers

Every year more and more IT Professionals, Technicians, Analysts, and Data Scientists consider this event a unique opportunity to connect with other people who share your passion, share your experience and the problems you face in your job, and learn more and more about the Microsoft Data Platform, so that you can always choose the best technology to win the challenges of the market. Don't forget that you also have the opportunity to meet and talk to people like Conor Cunningham (one of the principal architects of the relational engine of SQL Server and SQL Azure), Dr. David DeWitt (one of the foremost experts on parallel databases), and Mark Souza (General Manager in the Microsoft Data Platform Group). Do you think these people are too busy to talk with you? None of these big names will walk away if you try to meet and talk to them!

If you are undecided about the quality of the content provided at the PASS Summit, you can watch the previous years' sessions on PASS TV. You will have no doubts about the quality of the sessions :)

Are you a beginner or a first-timer at the PASS Summit? Don't worry: there are lots of level 100 and 200 sessions, and you can count on the "First Timers Guidebook", which will arrive soon!

What kind of sessions can you expect to find at the PASS Summit 2015? Find out from the recorded sessions of the last 24 Hours of PASS event (September 2015), named "Summit 2015 Preview Edition"; each video is available on the detail page of the related session.

You can find all the information you need on the PASS Summit 2015 website; for your convenience, here are some useful links:

    See you there!

  • 24 Hours of PASS (September 2015): Recordings Available! #24HOP

The sessions of the 24 Hours of PASS event named "Summit 2015 Preview Edition" (held on September 17-18, 2015) were recorded and are now available for online streaming!

If you missed one session in particular, or the entire event, you can view or review your preferred sessions. Each video is available on the detail page of the related session.

What can you expect from the next PASS Summit 2015? Find out from the recorded sessions of this edition of 24 Hours of PASS!

  • The series Microsoft #Azure Essentials: Free #eBooks

Since the beginning of this year, Microsoft Press (@MicrosoftPress) has been publishing a series of free eBooks about the services available on the Microsoft Azure platform. The eBooks are listed below:
     
     
    Microsoft Azure Essentials: Fundamentals of Azure
     
     
     
    This ebook covers the fundamentals of Azure you need to start developing solutions right away. It concentrates on the features of the Azure platform that you are most likely to need to know rather than on every feature and service available on the platform. This ebook also provides several walkthroughs you can follow to learn how to create VMs and virtual networks, websites and storage accounts, and so on. In many cases, real-world tips are included to help you get the most out of your Azure experience… Read more and download.
     
     
    Microsoft Azure Essentials: Azure Automation

     
    This ebook introduces a fairly new feature of Microsoft Azure called Azure Automation. Using a highly scalable workflow execution environment, Azure Automation allows you to orchestrate frequent deployment and life cycle management tasks using runbooks based on Windows PowerShell Workflow functionality. These runbooks are stored in and backed up by Azure. By automating runbooks, you can greatly minimize the occurrence of errors when carrying out repeated tasks and process automation… Read more and download.
     
     
    Microsoft Azure Essentials: Azure Machine Learning
     
     
    Microsoft Azure Machine Learning (ML) is a service that a developer can use to build predictive analytics models (using training datasets from a variety of data sources) and then easily deploy those models for consumption as cloud web services. Azure ML Studio provides rich functionality to support many end-to-end workflow scenarios for constructing predictive models, from easy access to common data sources, rich data exploration and visualization tools, application of popular ML algorithms, and powerful model evaluation, experimentation, and web publication tooling… Read more and download.
     
     
    Microsoft Azure Essentials: Azure Web Apps for Developers

     
    Azure Web Apps is a fully managed platform that you can use to build mission-critical web applications that are highly available, secure, and scalable to global proportions. Combined with first-class tooling from Visual Studio and the Microsoft Azure Tools, the Azure Web Apps service is the fastest way to get your web application to production. Azure Web Apps is part of the Azure App Service that is designed to empower developers to build web and mobile applications for any device… Read more and download.
     
Enjoy the eBooks on the Microsoft Azure platform! 
  • 24 Hours of PASS (September 2015): Summit Preview Edition

Which sessions can you expect to find at the next PASS Summit 2015? Find out on September 17, 2015, at 12:00 UTC, during the Summit 2015 Preview Edition of the 24 Hours of PASS.
     
     
     
Search for your preferred sessions by track: 

    Register now at this link; thanks to the sponsors, the 24 Hours of PASS is presented at no cost!

This edition of 24 Hours of PASS is meant to be a sneak peek at what you can expect from the next PASS Summit 2015, which will be held in Seattle (WA) from 27 to 30 October 2015.
     
No matter what part of the world you follow the event from, the important thing to know is that it will be 24 hours of continuous training on SQL Server and Business Intelligence, right at your desk!
  • SQL Server 2016 Video Pills

This is my first curation since Microsoft Curah! moved to a new Microsoft platform named docs.com.

SQL Server 2016 Video Pills is a collection of videos about the most important features of SQL Server 2016; the speakers are Engineers and Program Managers of the SQL Server Team!

    Would you like to discover the new features of SQL Server 2016? Watch these videos!

    Enjoy!

  • 24 Hours of PASS (June 2015): Session recordings now available!

The sessions of the event 24 Hours of PASS: Growing Our Community (held last June 24-25) were recorded and are now available for online streaming!

If you missed one session in particular, or the entire event, you can view or review your preferred sessions.

    All details are available here.

Remember, the next PASS Summit will be held in Seattle (WA) on October 27-30, 2015... don't miss it!

    Enjoy!

  • 24 Hours of PASS (June 2015)

The most important free online event on SQL Server and Business Intelligence is back!

The 24 Hours of PASS is coming back with a great edition that will provide best practices, expert tips, and demos from new and up-and-coming Speakers. Keep June 24 and 25, 2015 free!

    Register now at this link, it's free!

No matter what part of the world you follow the event from, the important thing to know is that it will be 24 hours of continuous training on SQL Server and Business Intelligence.

    Enjoy! 

  • SQL Server 2016 Public Preview

It's official: the first public preview of SQL Server 2016 will arrive this summer. Details are available on the SQL Server Blog:

If you want to know more about the capabilities of SQL Server 2016, visit the SQL Server 2016 preview page.

  • SQL Server DML Triggers - Tips & Tricks - Part 4 ("Statements that each Trigger should have")

This is the fourth blog post in the series dedicated to tips and tricks for DML Trigger development.

    The previous parts can be found here:


    The statements that each Trigger should have

A Trigger is optimized when its duration is brief. A Trigger always works within a Transaction, and its Locks remain active until the Transaction is committed or rolled back. As you can imagine, the longer a Trigger takes to execute, the higher the chance that it will block another process in the system.

The first thing you have to do to keep the Trigger execution short is to establish whether the Trigger has anything to do at all. If the statement that fired the Trigger affected no rows, there is nothing for the Trigger to do. So, the first thing a Trigger should do is check the number of rows affected by the previous statement.

The system variable @@ROWCOUNT tells you how many rows were changed by the previous DML statement. If the previous DML statement changed no rows, the value of @@ROWCOUNT will be zero, and there is nothing for the Trigger to do except give control back to the caller with the RETURN (T-SQL) command.

    The following piece of code should be placed at the beginning of all Triggers.

    IF (@@ROWCOUNT = 0)
      RETURN;

Checking the @@ROWCOUNT system variable also allows you to verify whether the number of rows affected is the number you expect; if not, the Trigger can give control back to the caller.

In a Trigger defined on multiple statements, you can query the Inserted and Deleted virtual tables to determine the exact number of inserted, updated, or deleted rows, as in the sketch below.
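
As a minimal sketch of this check (the table dbo.Orders and its columns are illustrative, not part of the original post), an AFTER UPDATE Trigger can count the affected rows through the Inserted virtual table:

CREATE TRIGGER dbo.TR_Orders_CountRows ON dbo.Orders
AFTER UPDATE AS
BEGIN
  -- Read @@ROWCOUNT before any other statement,
  -- because almost every statement resets it
  IF (@@ROWCOUNT = 0)
    RETURN;

  DECLARE @UpdatedRows INT;

  -- For an UPDATE, Inserted holds the new values and Deleted the old ones;
  -- both virtual tables contain one row per updated row
  SELECT @UpdatedRows = COUNT(*) FROM inserted;

  -- ... Trigger logic that uses @UpdatedRows ...
END;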

After that, you should consider that, for each statement executed, SQL Server sends back to the client the number of rows affected. If you aren't interested in the number of rows affected by each statement within a Trigger, you can set the NOCOUNT option to ON at the beginning of the Trigger and flip it back to OFF at the end. In this way, you can reduce network traffic dramatically.
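
A minimal sketch of this pattern, on the same hypothetical dbo.Orders table (note that the @@ROWCOUNT check has to come first, because SET statements reset @@ROWCOUNT to zero):

CREATE TRIGGER dbo.TR_Orders_Audit ON dbo.Orders
AFTER INSERT AS
BEGIN
  -- Check @@ROWCOUNT first: a SET statement resets it to zero
  IF (@@ROWCOUNT = 0)
    RETURN;

  SET NOCOUNT ON;  -- suppress the "n rows affected" messages from here on

  -- ... Trigger logic ...

  SET NOCOUNT OFF; -- optional: the setting reverts anyway when the Trigger exits
END;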

In addition, you can check whether the columns you are interested in were updated. The UPDATE (T-SQL) function tells you whether the column passed to it was updated (within an update Trigger) or was involved in an INSERT statement (within an insert Trigger). If the column was not updated, the Trigger has another chance to give control back to the caller; otherwise it goes on. In general, an update Trigger has something to do only when a column is updated and its values have actually changed; if no values have changed, the Trigger again has the chance to give control back to the caller. You can check whether the values have changed by querying the Inserted and Deleted virtual tables, as in the sketch below.
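
Putting these checks together, here is a sketch (still on the hypothetical dbo.Orders table, with illustrative columns OrderID and Quantity):

CREATE TRIGGER dbo.TR_Orders_QuantityChanged ON dbo.Orders
AFTER UPDATE AS
BEGIN
  IF (@@ROWCOUNT = 0)
    RETURN;

  SET NOCOUNT ON;

  -- The column we care about is not part of the UPDATE statement at all
  IF NOT UPDATE(Quantity)
    RETURN;

  -- The column is listed in the UPDATE statement,
  -- but no value has actually changed
  IF NOT EXISTS (
                 SELECT 1
                 FROM inserted AS i
                 INNER JOIN deleted AS d ON d.OrderID = i.OrderID
                 WHERE d.Quantity <> i.Quantity
                )
    RETURN;

  -- ... Trigger logic for real changes of Quantity ...
END;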

    Remember, a Trigger is optimized when its duration is brief!

  • Free eBook: Data Science in the Cloud with Azure Machine Learning

Microsoft has recently presented a new cloud platform: Azure Machine Learning.

    Azure Machine Learning provides an easy-to-use and powerful set of cloud-based data transformation and machine learning tools.

    If you want to know more, you can download, for free, the eBook Data Science in the Cloud with Microsoft Azure Machine Learning and R that covers the basics of manipulating data, as well as constructing and evaluating models in Azure Machine Learning.

    Enjoy the book!

  • Claim your free eBooks about JavaScript, CSS, HTML and Hadoop

Packt Publishing gives you the opportunity to download four free eBooks about:

    • JavaScript
    • CSS
    • HTML
    • Hadoop

    Claim your copies here.

  • SQL Server DML Triggers - Tips & Tricks - Part 3 ("Trigger debug")

This is the third blog post in the series dedicated to tips and tricks for DML Trigger development.

The first and second parts can be found here:


    Trigger debug 

The most important programming languages have debugging tools integrated into their development environments. A debugger usually has a graphical interface that allows you to inspect variable values at run time, analyze source code and program flow row by row, and manage breakpoints.

Every developer loves debugging tools, because they are very useful when a program fails in a calculation or runs into an error. Now, think about a Trigger that performs a very complex operation silently. Suppose that this Trigger runs into a problem; the questions that probably come to your mind are: "Can I debug a Trigger?" and, if so, "How can I do it?". Debugging a Trigger is possible with the Microsoft Visual Studio development tool (except the Express edition).

Consider the first version of the Trigger Production.TR_Product_StockLevel from my previous blog post SQL Server DML Triggers - Tips & Tricks - Part 2 ("Thought to work on multiple rows"):

    USE [AdventureWorks2014];
    GO

    CREATE TRIGGER Production.TR_Product_StockLevel ON Production.Product
    AFTER INSERT AS
    BEGIN
      /*
    Prevent inserting products with a safety stock level lower than 10
      */
      BEGIN TRY
        DECLARE
          @SafetyStockLevel SMALLINT;

        SELECT
          @SafetyStockLevel = SafetyStockLevel
        FROM
          inserted;

        IF (@SafetyStockLevel < 10)
          THROW 50000, N'Safety Stock Level cannot be lower than 10!', 1;
      END TRY
      BEGIN CATCH
        IF (@@TRANCOUNT > 0)
          ROLLBACK;
        THROW; -- Re-Throw
      END CATCH;
    END; 

As you have probably already noticed, the first version of that Trigger doesn't work well with multiple rows, because it wasn't designed to handle them. The customer we deployed that Trigger for complains that some products have a safety threshold, stored in the SafetyStockLevel column, lower than 10. Now we have to debug that DML AFTER INSERT Trigger; below you will learn how to do it.

The first step in debugging a Trigger is to create a stored procedure that encapsulates the statement able to fire the Trigger you want to debug. In this case, we have to create a stored procedure that performs an INSERT statement into the Production.Product table of the AdventureWorks2014 database. The following piece of T-SQL code creates the Production.USP_INS_PRODUCTS stored procedure in the AdventureWorks2014 database.

    USE [AdventureWorks2014];
    GO

    CREATE PROCEDURE Production.USP_INS_PRODUCTS
    AS BEGIN
      /*
        INSERT statement to fire Trigger TR_Product_StockLevel
      */

  INSERT INTO Production.Product
  (
   Name, ProductNumber, MakeFlag, FinishedGoodsFlag, SafetyStockLevel,
   ReorderPoint, StandardCost, ListPrice, DaysToManufacture,
   SellStartDate, rowguid, ModifiedDate
  )
  VALUES
  (
   N'BigBike8', N'BB-5388', 0, 0, 10 /* SafetyStockLevel */,
   750, 0.0000, 78.0000, 0, GETDATE(), NEWID(), GETDATE()
  ),
  (
   N'BigBike9', N'BB-5389', 0, 0, 1  /* SafetyStockLevel */,
   750, 0.0000, 62.0000, 0, GETDATE(), NEWID(), GETDATE()
  );
    END;
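
If you want a quick smoke test outside the debugger, you can also execute the procedure directly; whether the (still faulty) Trigger raises its error here depends on which row its single-value assignment happens to pick up, and that is exactly the behavior worth stepping through:

USE [AdventureWorks2014];
GO

-- Fires Production.TR_Product_StockLevel through the wrapper procedure
EXEC Production.USP_INS_PRODUCTS;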

The second step is to execute the stored procedure, created in the previous step, through Microsoft Visual Studio.

Open Microsoft Visual Studio and browse SQL Server Object Explorer: open the AdventureWorks2014 database tree, expand the Programmability folder, and locate the Production.USP_INS_PRODUCTS stored procedure in the "Stored Procedures" folder. Next, right-click the stored procedure Production.USP_INS_PRODUCTS; a context menu will appear, and when you select the item "Debug Procedure…", a new SQL Query page will open, ready to debug the stored procedure, as you can see in the following picture.

    Picture 1 – Debugging the USP_INS_PRODUCTS stored procedure through Microsoft Visual Studio

     

The execution pointer is set to the first executable instruction of the T-SQL script automatically generated by the Visual Studio debugger. Using the step-into debugger function (F11), you can execute the Production.USP_INS_PRODUCTS stored procedure step by step, up to the INSERT statement that fires the Trigger you want to debug. If you press the step-into button (F11) while the execution pointer is on the INSERT statement, the execution pointer will jump into the Trigger, onto its first executable statement, as shown in the following picture.

    Picture 2 – A breakpoint within a Trigger

     

The debugger execution pointer is now on the first executable statement of the Trigger; you can now execute the Trigger's code step by step and observe the contents of its variables. In addition, you can see the exact execution flow and the number of rows affected by each statement. If multiple Triggers fire on the same statement, the "Call Stack" panel will show the execution chain, and you will be able to discover how the Triggers' code works.

    Enjoy the debugger! :)

All examples in this article are based on the AdventureWorks2014 database, which you can download from the CodePlex website at this link.

  • SQL Server DML Triggers - Tips & Tricks - Part 2 ("Thought to work on multiple rows")

This is the second blog post in the series dedicated to tips and tricks for DML Trigger development.

    The first part can be found here: SQL Server DML Triggers - Tips & Tricks - Part 1 ("Intro" & "Triggers activation order").

     

Triggers must be designed to work on multiple rows

One of the most frequent mistakes I have seen in my experience of debugging and tuning Triggers is this: the author of the Trigger doesn't consider that, sooner or later, the Trigger will have to work on multiple rows! I have seen many Triggers, especially those that implement domain integrity constraints, that were not designed to work on multiple rows. In certain cases, this mistake results in incorrect data being stored (an example follows).

Suppose that you have to develop a DML AFTER INSERT Trigger to prevent storing values lower than 10 in the SafetyStockLevel column of the Production.Product table in the AdventureWorks2014 database. This customized business logic may be required to guarantee no production downtime in your company when a supplier is late in delivering.

    The following piece of T-SQL code shows the CREATE statement for the Trigger Production.TR_Product_StockLevel.

    USE [AdventureWorks2014];
    GO

    CREATE TRIGGER Production.TR_Product_StockLevel ON Production.Product
    AFTER INSERT AS
    BEGIN
      /*
    Prevent inserting products with a safety stock level lower than 10
      */
      BEGIN TRY
        DECLARE
          @SafetyStockLevel SMALLINT;

        SELECT
          @SafetyStockLevel = SafetyStockLevel
        FROM
          inserted;

        IF (@SafetyStockLevel < 10)
          THROW 50000, N'Safety Stock Level cannot be lower than 10!', 1;
      END TRY
      BEGIN CATCH
        IF (@@TRANCOUNT > 0)
          ROLLBACK;
        THROW; -- Re-Throw
      END CATCH;
    END;
    GO

A very good habit, before applying Triggers and changes (in general) to the production environment, is to spend time testing the Trigger code, especially for borderline cases and values. So, in this example you have to test whether this Trigger is able to reject every INSERT statement that tries to store values lower than 10 in the SafetyStockLevel column of the Production.Product table. The first test you can do, for example, is to try to insert one wrong value and observe the error caught by the Trigger. The following statement tries to insert a product with SafetyStockLevel lower than 10.

    USE [AdventureWorks2014];
    GO

    -- Test one: Try to insert one wrong product
INSERT INTO Production.Product
(
 Name, ProductNumber, MakeFlag, FinishedGoodsFlag, SafetyStockLevel,
 ReorderPoint, StandardCost, ListPrice, DaysToManufacture,
 SellStartDate, rowguid, ModifiedDate
)
VALUES
(
 N'Carbon Bar 1', N'CB-0001', 0, 0, 3 /* SafetyStockLevel */,
 750, 0.0000, 78.0000, 0, GETDATE(), NEWID(), GETDATE()
);

As you would expect, SQL Server rejected the INSERT statement because the value assigned to SafetyStockLevel is lower than 10, and the Trigger Production.TR_Product_StockLevel blocked the statement.

The output shows that the Trigger worked well:

    Msg 50000, Level 16, State 1, Procedure TR_Product_StockLevel, Line 17

    Safety Stock Level cannot be lower than 10!

     

Now you have to test the Trigger with statements that try to insert multiple rows. The following statement tries to insert two products: the first product has a wrong value for the SafetyStockLevel column, whereas the second one is valid. Let's see what happens.

    USE [AdventureWorks2014];
    GO

    -- Test two: Try to insert two products
INSERT INTO Production.Product
(
 Name, ProductNumber, MakeFlag, FinishedGoodsFlag, SafetyStockLevel,
 ReorderPoint, StandardCost, ListPrice, DaysToManufacture,
 SellStartDate, rowguid, ModifiedDate
)
VALUES
(
 N'Carbon Bar 2', N'CB-0002', 0, 0, 4  /* SafetyStockLevel */,
 750, 0.0000, 78.0000, 0, GETDATE(), NEWID(), GETDATE()
),
(
 N'Carbon Bar 3', N'CB-0003', 0, 0, 15 /* SafetyStockLevel */,
 750, 0.0000, 78.0000, 0, GETDATE(), NEWID(), GETDATE()
);
    GO

The output shows that the Trigger worked well again: SQL Server rejected the INSERT statement because, in the first row, the value 4 for the SafetyStockLevel column is lower than 10 and can't be accepted.

    Msg 50000, Level 16, State 1, Procedure TR_Product_StockLevel, Line 17

    Safety Stock Level cannot be lower than 10!

     

If you have to deploy your Trigger as soon as possible, you could convince yourself that it works properly; after all, you have already run two tests and all wrong rows were rejected. So you decide to apply the Trigger to the production environment; but what happens if someone, or an application, tries to insert two products where the single wrong value appears in a different order from the one you used in the previous test?

Let's look at the following INSERT statement, in which the first row is valid and the second one is wrong.

    USE [AdventureWorks2014];
    GO

    -- Test three: Try to insert two rows
-- The first row is right, but the second one is wrong
INSERT INTO Production.Product
(
 Name, ProductNumber, MakeFlag, FinishedGoodsFlag, SafetyStockLevel,
 ReorderPoint, StandardCost, ListPrice, DaysToManufacture,
 SellStartDate, rowguid, ModifiedDate
)
VALUES
(
 N'Carbon Bar 4', N'CB-0004', 0, 0, 18 /* SafetyStockLevel */,
 750, 0.0000, 78.0000, 0, GETDATE(), NEWID(), GETDATE()
),
(
 N'Carbon Bar 5', N'CB-0005', 0, 0, 6 /* SafetyStockLevel */,
 750, 0.0000, 78.0000, 0, GETDATE(), NEWID(), GETDATE()
);
    GO

The last INSERT statement completed successfully, but the inserted data does not respect the domain constraint implemented by the Trigger, as you can see in the following picture.

    Picture 1 – Safety stock level: Domain integrity violated for product named "Carbon Bar 5"


The safety stock level value for the product named "Carbon Bar 5" doesn't respect the business constraint implemented by the Trigger Production.TR_Product_StockLevel: this Trigger wasn't designed to work on multiple rows. The mistake is in the following assignment:

    SELECT
      @SafetyStockLevel = SafetyStockLevel
    FROM
      Inserted;

The local variable @SafetyStockLevel can hold only one value from the SELECT on the Inserted virtual table, and which value it ends up with depends on the order in which the rows are processed (typically the last row returned by the execution plan). If the captured row happens to have a suitable value in the SafetyStockLevel column, the Trigger considers all the other rows right as well; in that case, disallowed values (lower than 10) in the remaining rows will be stored anyway!
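
Here is a minimal, self-contained sketch of the pitfall (the literal values 18 and 6 are illustrative):

-- A scalar variable keeps only one of the values
-- returned by a multi-row SELECT
DECLARE @SafetyStockLevel INT;

SELECT @SafetyStockLevel = SafetyStockLevel
FROM (VALUES (18), (6)) AS t(SafetyStockLevel);

-- Only one of the two values survives in the variable; the other is
-- silently discarded, so a check like IF (@SafetyStockLevel < 10)
-- may never see the offending row
SELECT @SafetyStockLevel AS CapturedValue;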

How can the Trigger's author fix this issue? By checking the SafetyStockLevel value of all rows in the Inserted virtual table: if the Trigger finds even one value that is not allowed, it returns an error. Below is version 2.0 of the Trigger Production.TR_Product_StockLevel; it fixes the issue by changing the previous SELECT statement into an IF EXISTS (SELECT ...) test.

    USE [AdventureWorks2014];
    GO

    ALTER TRIGGER Production.TR_Product_StockLevel ON Production.Product
    AFTER INSERT AS
    BEGIN
      /*
    Prevent inserting products with a safety stock level lower than 10
      */
      BEGIN TRY
        -- Testing all rows in the Inserted virtual table
        IF EXISTS (
                   SELECT ProductID
                   FROM inserted
                   WHERE (SafetyStockLevel < 10)
                  )
          THROW 50000, N'Safety Stock Level cannot be lower than 10!', 1;
      END TRY
      BEGIN CATCH
        IF (@@TRANCOUNT > 0)
          ROLLBACK;
        THROW; -- Re-Throw
      END CATCH;
    END;
    GO

This new version is designed to work on multiple rows, and it always works properly.

However, the best implementation of this business logic is a CHECK constraint, which is the right tool for customized domain integrity. The main reason to prefer CHECK constraints over Triggers, when you have to implement customized domain integrity, is that all constraints (such as CHECK, UNIQUE, and so on) are checked before the statement executes, whereas AFTER DML Triggers fire only after the statement has been executed. As you can imagine, for performance reasons, in this scenario the CHECK constraint solution is better than the Trigger solution.
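
As a minimal sketch of the constraint-based alternative (the constraint name below is illustrative; the Trigger is dropped first so that the rule isn't enforced twice):

USE [AdventureWorks2014];
GO

-- Remove the Trigger, then enforce the rule declaratively
IF OBJECT_ID(N'Production.TR_Product_StockLevel', N'TR') IS NOT NULL
  DROP TRIGGER Production.TR_Product_StockLevel;
GO

-- WITH NOCHECK skips validation of rows already in the table;
-- new and modified rows are checked before the statement completes
ALTER TABLE Production.Product WITH NOCHECK
  ADD CONSTRAINT CK_Product_SafetyStockLevel_Min10
  CHECK (SafetyStockLevel >= 10);
GO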

All examples in this post are based on the AdventureWorks2014 database, which you can download from the CodePlex website at this link.
