THE SQL Server Blog Spot on the Web


Damian Widera

  • SQL Server 2016 CTP 2.3 - available for download


    The CTP2.3 is ready and waiting for you, SQL Folks! 

    The official statement (by Kevin Farlee) is as follows:

    This release includes enhancements to Analysis Services, Database Engine, Integration Services and Master Data Services. The build also includes stability and bug fixes across the components. Several fixes address issues reported by customers in CTP2, CTP2.1 and CTP2.2. During this milestone, preview 1.1 of SQL Server 2016 Upgrade Advisor has been released for download as well.

    Feature enhancements in CTP2.3 (apart from what we have seen so far):

    1. Analysis Services

       a. User interface added for Extended Events

       b. New DAX functions for Date/Time, Filtering, Math and Trigonometry, Statistics, and Text

       c. New behaviors for DAX variables

       d. DirectQuery now generates simpler, better-performing queries

    2. Database Engine Security

       a. Row-Level Security is now supported for memory-optimized tables

       b. Default encryption for endpoints has been changed from RC4 to AES

    3. Columnstore Indexes

       a. Nonclustered columnstore indexes on tables with triggers and/or CDC/Change Tracking

       b. Performance optimizations for columnstore indexes

    4. Integration Services

       a. Support for the OData v4 protocol

       b. SSIS error column support

       c. Advanced logging levels

    5. Master Data Services

       a. The MDS Add-in for Excel now allows business rules management

       b. Many-to-many relationships in derived hierarchies

       c. Merge conflicts

    6. SQL Server 2016 Upgrade Advisor Preview 1.1
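
    The Row-Level Security item (2.a) follows the pattern introduced earlier in the CTP cycle; here is a minimal sketch, where all table, function, and policy names are hypothetical:

    ```sql
    -- Inline predicate function: a row is visible only to the matching user
    CREATE FUNCTION dbo.fn_SalesPredicate (@SalesRep AS sysname)
    RETURNS TABLE
    WITH SCHEMABINDING
    AS
    RETURN SELECT 1 AS fn_result
           WHERE @SalesRep = USER_NAME();
    GO

    -- Bind the predicate to the table as a filter
    CREATE SECURITY POLICY dbo.SalesFilter
    ADD FILTER PREDICATE dbo.fn_SalesPredicate(SalesRep)
    ON dbo.Orders
    WITH (STATE = ON);
    ```

    Note that for memory-optimized tables the predicate function has additional requirements (natively compiled inline TVF), so treat this only as an illustration of the general mechanism.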


    Download Link:

    You can also read the official blogs:

    Engineering CTP 2.3 blog: 







  • SQL Server 2016 Insights by Polish MVPs

    Hello SQL Folks!

    In one of my latest posts I wrote about the SQL Server 2016 Channel 9 series being prepared by Polish MVPs.

    Now we have 3 recordings there but expect many more :). You can find more details about the series here:

    The first session is dedicated to the security enhancements that will be available in SQL Server 2016. This session was prepared by Marcin Szeliga, our PLSSUG President. The link to the recording is here:

    The second session is about hybrid scenarios, especially a new feature called Stretch Database. This recording was done by Bartek Graczyk. The link to the recording is here:

    The third session was prepared by me! It is about monitoring data changes using Temporal Tables. The link to the recording is here:


    Well, I think it is a good opportunity to learn about SQL Server 2016 before it is generally available :) Please stay tuned, as more recordings will be posted soon! 

    Oh, BTW - the sessions are in Polish, but I promise to write a blog post about the Temporal Tables feature in English this week!
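
    Until that post appears, here is a minimal sketch of a system-versioned (temporal) table; all table and column names are hypothetical:

    ```sql
    -- System-versioned table: SQL Server maintains the history automatically
    CREATE TABLE dbo.Product
    (
        ProductId  int           NOT NULL PRIMARY KEY CLUSTERED,
        Name       nvarchar(50)  NOT NULL,
        Price      money         NOT NULL,
        ValidFrom  datetime2 GENERATED ALWAYS AS ROW START NOT NULL,
        ValidTo    datetime2 GENERATED ALWAYS AS ROW END   NOT NULL,
        PERIOD FOR SYSTEM_TIME (ValidFrom, ValidTo)
    )
    WITH (SYSTEM_VERSIONING = ON (HISTORY_TABLE = dbo.ProductHistory));

    -- Every UPDATE/DELETE is recorded in dbo.ProductHistory;
    -- past states can be queried with FOR SYSTEM_TIME:
    SELECT *
    FROM dbo.Product
    FOR SYSTEM_TIME AS OF '2015-09-01T00:00:00';
    ```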




  • Stellar Phoenix SQL Database Repair - first look / review

    Today I was able to play a little bit with Stellar Phoenix SQL Database Repair.

    What is Stellar Phoenix SQL Database Repair?

    According to the website, the tool can handle these problems:

    -        Repairs corrupt MDF and NDF files of a SQL Server database

    -        Recovers tables, triggers, indexes, keys, constraints, rules, and defaults

    -        Restores SQL Server ROW-compressed and PAGE-compressed data

    -        Supports MS SQL Server 2014, 2012, 2008 and all lower versions


    I took a simple database data file that was already corrupted. If you tried to attach the file to the server, you would see the following error:

    The log cannot be rebuilt because there were open transactions/users when the database was shutdown, no checkpoint occurred to the database, or the database was read-only.

    This error could occur if the transaction log file was manually deleted or lost due to a hardware or environment failure.


    As I had only the data file, I first tried to attach the database using the sp_attach_single_file_db system stored procedure.
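
    For reference, the deprecated procedure and its modern equivalent, which attempts to rebuild the missing log, look like this (database name and path are hypothetical):

    ```sql
    -- Deprecated approach mentioned above:
    EXEC sys.sp_attach_single_file_db
         @dbname   = N'CorruptDemo',
         @physname = N'C:\Temp\CorruptDemo.mdf';

    -- Modern equivalent that tries to rebuild the log file:
    CREATE DATABASE CorruptDemo
    ON (FILENAME = N'C:\Temp\CorruptDemo.mdf')
    FOR ATTACH_REBUILD_LOG;
    ```

    Both fail with the error quoted above when there were open transactions at shutdown, which is exactly the scenario this tool targets.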

    I opened the tool, and the first message was as follows:

    That makes sense as it is hard to perform any operations on database files if those are under the control of the database engine. The only way is to take the data file or files away from the SQL Server.

    The next step is to select the data files. There are two ways to do so, as you can see in the picture below:

    -        You can point to the database file directly

    -        You can search for it


    I chose the first option and clicked the “Repair” button. The next and last question is about the database version. That step is necessary, as the metadata of SQL Server databases has changed over time. I picked SQL Server 2014 and clicked “OK”.


    Now the application started working, and after a while I was informed that it is possible to repair my database. The application presented a short report of all the actions it had taken. 

    It is possible to take a look at the database, examine all objects and even to see the data:


    At this point it is possible to repair and save the database (which also attaches it) to the given instance of SQL Server.

    And the magic happened :) The database is up and running!



    I really like the very simple and intuitive interface. It is so simple that you don't even have to remember all the fancy options such a program might have. The application just repairs the database automatically :)


    The Product Webpage:

  • My MVA course about Machine Learning is available!

    So I did it again! The new MVA course is ready :) and waiting for you. This time I briefly explain what Azure Machine Learning is and what it is for :)

    Just visit the link and enjoy. Watch out - it's in Polish :)

    The topics covered:

    -        Machine Learning Introduction

    -        Machine Learning for business – why and what for?

    -        Machine Learning for

    more coming....

    One of the next MVA courses I am going to prepare will be about... wait for it..... Oracle & Azure ;)


    And stay tuned for the SQL Server 2016 Channel 9 series that is being prepared by Polish MVPs. More details:

    I will have two recordings there very soon  






  • SQL Server 2014 Service Pack 1 is available (again)!

    I have just been notified that SQL Server 2014 SP1 is available for download. I was part of the "testing team" for this new/updated SP1 and found no issues! I hope you will not find any either.

    Below you can find the official information about the SP1:

    As previously communicated, the Service Pack 1 release on April 15 was recalled very shortly after going live on the Microsoft Download Center (only) to address a recall issue. That issue has been resolved, and after complete recertification the Service Pack is now available. The new release is v12.0.4100.1, KB3058865.

    Approximately one year ago, we launched SQL Server 2014. Today, May 15, we are pleased to announce the release of SQL Server 2014 Service Pack 1 (SP1). The Service Pack is initially available for download on the Microsoft Download Center. SQL Server 2014 SP1 will be rolling out to additional venues including MSDN/TechNet, the Volume Licensing center, and Microsoft Update starting May 21, 2015.

    SQL Server 2014 SP1 contains fixes provided in SQL Server 2014 CU 1 up to and including CU 5, as well as a rollup of fixes previously shipped in SQL Server 2012 SP2. For highlights of the release, please read the Knowledge Base Article for Microsoft SQL Server 2014 SP1.

    As part of our continued commitment to software excellence for our customers, this upgrade is available to all customers with existing SQL Server 2014 deployments via the download links below.………






  • How to rename a column when CDC is enabled

    I have recently been asked whether AlwaysOn supports metadata operations on tables: adding, dropping, and renaming columns. I said it was possible. However, I was informed that during a column rename there was an error about “replication”:

    Caution: Changing any part of an object name could break scripts and stored procedures.

    Msg 4928, Level 16, State 1, Procedure sp_rename, Line 655

    Cannot alter column 'Name' because it is 'REPLICATED'.

    I knew that replication was not used in this scenario, so my next guess was that it must be caused by CDC. And I was right. Let's do a simple example to see how CDC works and what happens if you try to rename a column.

    I downloaded the AdventureWorks2014 database from the CodePlex page and enabled CDC in this database. I also created a role that could be used to grant access to the CDC data; however, it is not essential for our example.

    USE AdventureWorks2014;
    GO

    CREATE ROLE [cdc_admin];
    GO

    EXEC sys.sp_cdc_enable_db;
    GO



    Next I configured all CDC-related objects to be stored in a separate filegroup, for performance reasons:

    ALTER DATABASE AdventureWorks2014
    ADD FILEGROUP [CDC_Objects];

    ALTER DATABASE AdventureWorks2014
    ADD FILE ( NAME = N'CDC_Objects',
               FILENAME = N'C:\Temp\Adwentureworks_cdc.ndf',
               SIZE = 1024MB, FILEGROWTH = 1024MB )
    TO FILEGROUP [CDC_Objects];


    The last configuration step is to choose the table that will be under the control of the CDC feature. At this point you can also specify the role, the capture instance name (which matters when a table has two capture instances), and the filegroup name.

    EXEC sys.sp_cdc_enable_table
           @source_schema = N'Sales',
           @source_name = N'Store',
           @role_name = N'cdc_admin',
           @capture_instance = N'SalesStore',
           @filegroup_name = N'CDC_Objects';


    With the CDC configuration done, I tried to rename one of the columns of the Sales.Store table:

    EXEC sys.sp_rename 'Sales.Store.Name' , 'SomeNewBetterName', 'COLUMN'


    I immediately got this error:

    Caution: Changing any part of an object name could break scripts and stored procedures.

    Msg 4928, Level 16, State 1, Procedure sp_rename, Line 655

    Cannot alter column 'Name' because it is 'REPLICATED'.


    I checked the sys.columns view and it was true: the is_replicated flag was set to 1 for all columns of the CDC-enabled table.

    That makes perfect sense to me, as the CDC feature uses the log reader agent to detect changes.

    Change data capture and transactional replication always use the procedure sp_replcmds to read changes from the transaction log.


    SELECT is_replicated FROM sys.columns
    WHERE object_id = object_id(N'Sales.Store')
    AND name = N'Name';
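
    Since disabling CDC drops the capture tables, it is worth scripting the current capture-instance settings first so they can be restored afterwards; one way to inspect them is:

    ```sql
    -- Lists capture instance, role, and index details for all CDC-enabled tables
    EXEC sys.sp_cdc_help_change_data_capture;

    -- Or just check which tables are tracked:
    SELECT name, is_tracked_by_cdc
    FROM sys.tables
    WHERE is_tracked_by_cdc = 1;
    ```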


    What should we do in case of a column rename? There is only one way I am aware of: CDC must first be disabled for the table. But be careful before you run the statement, because all related CDC tables will be dropped immediately and without any warning.

    EXEC sys.sp_cdc_disable_table
           @source_schema = N'Sales',
           @source_name = N'Store',
           @capture_instance = N'all';



    Finally it is possible to rename the column.

    EXEC sys.sp_rename 'Sales.Store.Name' , 'SomeNewBetterName', 'COLUMN'


    Once all changes are done, CDC should be enabled again:

    EXEC sys.sp_cdc_enable_table
           @source_schema = N'Sales',
           @source_name = N'Store',
           @role_name = N'cdc_admin',
           @capture_instance = N'SalesStore',
           @filegroup_name = N'CDC_Objects';



    And that's all for today! 




  • Playing with the Filestream storage on Sunday

    Today I was asked to do a pretty easy thing. The task was to add FILESTREAM support to a table that already stores some documents. The customer wants to keep the data outside the database because he uses SQL Server 2014 Express Edition. According to the MSDN page, this edition has a 10 GB per-database limit, and FILESTREAM data does not count toward that limit.

    But let's go back to work now. The first thing we have to do is enable FILESTREAM support at the server level. That can be done using SQL Server Configuration Manager or the sp_configure stored procedure. I prefer the second way. Please remember: you have to be a sysadmin to configure it.

    USE demo_db;
    GO

    EXEC sys.sp_configure N'filestream access level', N'2';
    RECONFIGURE;
    GO





    The second step is to make the database aware of the FILESTREAM support. To do so you need to add a new filegroup to the database, a special one that will contain the FILESTREAM data. By the way, it is of course possible to have more than one such filegroup in a database, but only one of them can be marked as default.
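
    The statement that creates the filegroup did not make it into this post; a minimal sketch, assuming the demoFS filegroup name used later on:

    ```sql
    -- Create a filegroup dedicated to FILESTREAM data
    ALTER DATABASE demo_db
    ADD FILEGROUP demoFS CONTAINS FILESTREAM;
    ```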




    When the filegroup is in place, it's time to add a file to it. It is also possible to have more than one file in a FILESTREAM filegroup.

    ALTER DATABASE demo_db
    ADD FILE (
        NAME = demo_dbFS,
        FILENAME = 'C:\dbdemo',
        MAXSIZE = UNLIMITED
    ) TO FILEGROUP demoFS;


    Now it's time to get down to the table level. The customer has one table that should contain the FILESTREAM data. Currently the data is stored in a column called Data of type varbinary(MAX). In order to have a FILESTREAM-aware table, you have to do three things:

    • Add a column of type uniqueidentifier marked as ROWGUIDCOL.

    • Set the FILESTREAM_ON option on the table.

    • Add a new column of type varbinary(MAX) marked for FILESTREAM data.


    Follow the three statements below:

    ALTER TABLE dbo.Documents
    ADD filestreamguidcol uniqueidentifier NOT NULL ROWGUIDCOL UNIQUE DEFAULT NEWID();

    ALTER TABLE dbo.Documents
    SET (FILESTREAM_ON = demoFS);

    ALTER TABLE dbo.Documents
    ADD DataFS varbinary(MAX) FILESTREAM NULL;



    After the necessary setup, it is time to transfer data from the existing column (Data) to the FILESTREAM column (DataFS). I did it by running a simple UPDATE command:

    UPDATE dbo.Documents

    SET DataFS = Data
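
    Before dropping the old column it may be worth verifying that every document was actually copied; a simple check could look like this:

    ```sql
    -- Should return 0: every non-NULL document has its FILESTREAM copy
    SELECT COUNT(*) AS MissingCopies
    FROM dbo.Documents
    WHERE Data IS NOT NULL AND DataFS IS NULL;
    ```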



    Now it’s safe to drop the Data column:

    ALTER TABLE dbo.Documents

    DROP COLUMN [Data]



    The last step in this example is to rename the new column (DataFS) to the name that existed and was dropped in the previous step (Data), so I do not have to change the existing application data access layer. If you wish, you can refresh all dependent objects at the end.

    EXEC sp_rename

        @objname = 'Documents.DataFS',

        @newname = 'Data',

        @objtype = 'COLUMN'


    By the way, there are at least two great sources of information (including internals) and some myth-busting done by Paul Randal and Bob Beauchemin.



  • My MVA course about SQL Server on Azure - PaaS or Iaas?

    If you are tired today, maybe you could take your time and spend the whole weekend with SQL Server 2014 and Azure?

    The new MVA course is ready :) and waiting for you. Just visit the link and enjoy. Watch out - it's in Polish :)

    The topics covered:

    • SQL Server as IaaS – meaning SQL Server on an Azure VM
    • SQL Server as PaaS – meaning Azure SQL Database
    • IaaS vs PaaS – what to choose and when

    more coming....


    One of the next MVA courses I am going to prepare will be about... wait for it..... Oracle & Azure ;)




  • SQLRally Nordic - after conference thoughts

    Hello All

    I was extremely honored to be able to participate in the SQLRally Nordic this year. The event took place this week in Copenhagen and I think it was a great success - from every angle. Many thanks to the organizers and many thanks to the community for being in that place. There was a great atmosphere during the whole event and what I like the most as a speaker - I had a great room for my speech! 

    I hope that the event will take place next year as well, but now I would like to invite you to the annual PLSSUG conference in Wrocław. Let's meet in May. At least half of the tracks will be in English, and you can also meet many world-class speakers. Just visit the page




  • Memory leak occurs when you run DBCC CHECKDB against a database in SQL Server 2014 - fixed in SQL Server 2014 CU6

    Great news today!

    At the end of January a Connect item was filed that described a possible memory leak in SQL Server 2014.

    The details are as follows: 

    sys.dm_os_memory_clerks for type = MEMORYCLERK_SQLQUERYEXEC and sys.dm_os_memory_objects for type = MEMOBJ_COLUMNBATCH show ever increasing values when columnstore tables are integrity checked. For example, if you run DBCC CHECKDB on a database that contains columnstore tables or you run DBCC CHECKTABLE on a columnstore table you will find that the memory shown for the above mentioned clerk and object types is ever increasing.  
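
    The growth described above can be observed directly with the DMV mentioned; for example:

    ```sql
    -- Watch this value between successive DBCC CHECKDB runs:
    -- on an affected build it keeps increasing
    SELECT type, SUM(pages_kb) AS pages_kb
    FROM sys.dm_os_memory_clerks
    WHERE type = N'MEMORYCLERK_SQLQUERYEXEC'
    GROUP BY type;
    ```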

    The good news is that the problem is fixed and the solution will be available in SQL Server 2014 CU6. Find more details: 





  • SQL Server 2012 SP2 Cumulative Update 4 Released - information from the SQL Server SE Operations Team

    As per information from Microsoft:

    "It is my pleasure to announce the release of SQL Server 2012 SP2 Cumulative Update 4 on behalf of the team.  SQL Server 2012 SP2 Cumulative Update 4 incorporates 42 issues.

    This CU will be available for download from the associated cumulative KB article that has also been published.  Customers are directed to contact CSS to get the CU build or obtain the hotfix package through the new self-service feature by clicking on the “Hotfix Download Available” button found at the top of the KB article."

    To me the most important fix is the one regarding the possibility of errors 17066 or 17310 during SQL Server startup (immediately after database recovery is complete and client connections are enabled). Check out the link for details.

    Link to the official site of this update:




  • Kimberly Tripp is going to visit PLSSUG in January (online meeting)

    Paul Randal recently proposed that SQLskills could give sessions at user group meetings. I emailed him immediately and, to make a long story short, the PLSSUG (Polish SQL Server Users Group) will be hosting Kimberly Tripp on 29th January at 6 PM local time. Please visit the community webpage.

    Here are some details:

    Stored Procedure Optimization Techniques

    Presenter: Kimberly L. Tripp

    Abstract: Kimberly will discuss stored procedure caching and the potential for performance problems due to a problem known as parameter sniffing. There are some common solutions, and while they can drastically improve performance, they can also cause other problems. Sometimes a hybrid solution is needed. From this session, you'll learn the best way to write procedures so that stable requests are cached and unstable requests are not (giving you the most optimal performance without wasting resources).
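
    One of the common solutions the abstract alludes to can be sketched like this (the procedure and table names are hypothetical):

    ```sql
    -- A procedure whose plan, once cached for an atypical parameter value,
    -- could perform poorly for typical ones (parameter sniffing)
    CREATE PROCEDURE dbo.GetOrdersByCustomer
        @CustomerId int
    AS
    BEGIN
        SELECT OrderId, OrderDate
        FROM dbo.Orders
        WHERE CustomerId = @CustomerId
        -- One common (but not free) fix: compile a fresh plan per call
        OPTION (RECOMPILE);
    END;
    ```

    As the abstract notes, such fixes trade caching benefits for plan stability, which is why a hybrid approach is often needed.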


    Bio: Kimberly L. Tripp, President / Founder,

    Kimberly has been a Microsoft SQL Server MVP and a Microsoft Regional Director for over 10 years; she's an author, consultant, and top-rated speaker having spoken at PASS, TechEd, SQLintersection, and numerous other events around the world. She has over 20 years of experience designing and tuning SQL Server solutions with a focus on optimizing procedural code, indexing strategies, statistics, and partitioning. When she's not tuning SQL servers, she's diving and photographing all sorts of underwater critters such as frogfish with her husband, Paul S. Randal.

    I hope that this is great news.

    And I promise - this will not be the only meeting :).

    More details will follow on the community webpage (as well as information how to participate in that meeting) -






  • SQLDay 2015 - should we plan it now? Yes - check out who will join us :)

    SQL Folks, SQL Geeks and others....

    Is it the right time to plan the next conference when the previous one finished only last week? YES!!!! It is the right time!!! PLSSUG is planning to have the biggest community conference ever. It will start on 11th May (Monday, preconf) and run until Wednesday, 13th May (at least). The first great news is that Adam Machanic will be with us, so do not wait - register for his preconf and sessions :). Adam will not be the only superstar next year, but more details will follow. Remember that this year Bob Ward, Klaus Aschenbrenner and Alberto Ferrari were the special guests. And do not forget about the local leaders - you should have seen them last week in Lublin. So you know what I am talking about.

    So stay tuned, and visit the PLSSUG page frequently.





  • Conference - ITAD Bielsko Biała

    Good evening

    Today I gave a lecture at the academic conference ITAD. It is amazing that 300 people came and took part in the event. 

    I was talking about SQLCLR, which seems to have been my main topic in recent months, because I use it over and over again in my projects. The idea was to share best practices with people who are probably just starting out with this feature. Many thanks to the organizers for the invitation, and congratulations once again - it was a really impressive event. I hope to see you next year, as I promised to show you some T-SQL stuff that makes SQL Server fly (not cry).

    If you want to see what I was talking about, you can download the materials from the drive. Many thanks to Adam Machanic for help in preparing the demos and sharing knowledge about SQLCLR.



  • SQL Day Lite 2014 Lublin - part II - photos

    Hello again

    I am pleased to announce that pictures taken during the conference are available. Just take a look here. Thanks to Kamil Nowiński for the photos.

    It is also a good moment to say "thanks" to the PLSSUG (Polish SQL Server Users Group) for the organization. Again, it was fabulous :) Over 100 people attended the Saturday conference, and over 45 were with us on Friday during the workshops (preconf). Please visit the PLSSUG site for the recorded sessions (coming soon...).

    The agenda of the conference was as follows:

    Łukasz Grala - SQL Server Analysis Services - Multidimensional vs Tabular

    me - All you wanted to know about the DBCC 

    Grzegorz Stolecki - Reporting Services - the rendering report extensions

    Łukasz Grala - What everybody (not only DBA's) should know about the backup

    Marek Adamczuk & Paweł Potasiński - Windowing Functions

    Maciej Pilecki - Event Notification 



