THE SQL Server Blog Spot on the Web


Denis Gobo

  • SQL Server Application and Multi-server Private CTP Announced

    Microsoft is looking for participants from organizations with at least one DBA, more than 25 PCs, and SQL Server installations across the organization. As part of SQL Server “Kilimanjaro”, Microsoft has announced baseline investments in application and multi-server management, and they are recruiting customers to participate in a private CTP with SQL Server Engineering. Registration for the Private CTP will take place until December 15, 2008, and the 4-5 week Private CTP will begin in mid-January.


    Your SQL Server 2008 trial account includes:
    Hyper-V Image
    Reporting and Analysis Services
    Integration Services
    Full System Admin Rights


    More details and sign-up links can be found here:


    See you in the cloud (or was that SkyNet) :-)


    P.S. I am waiting for Jason Massie's comment about how this is the beginning of the end for the DBA  ;-)



  • Cumulative update package 2 for SQL Server 2008 has been released

    Cumulative update package 2 for SQL Server 2008 has been released. This build of the cumulative update package is also known as build 10.00.1779.00.

    To see all the fixes and how to obtain cumulative update package 2 for SQL Server 2008, go here:

  • Microsoft SQL Data Services public CTP announced

    The Microsoft SQL Data Services public CTP has been announced.

    Microsoft® SQL Data Services (SDS) offers highly scalable and Internet-facing distributed database services in the cloud for storing and processing relational queries. SDS can help you develop and provision new applications quickly with REST and SOAP based web protocols. The services are built on robust SQL Server database and Windows Server technologies, providing high availability and security.

    This is all part of the Azure platform. The CTP is free, but you do need a credit card for verification purposes.

    Visit the SQL Data Services Dev Center

    Try the SQL Data Services Public CTP



  • Some of my reasons for upgrading to SQL Server 2008

    So Aaron posted My reasons for upgrading to SQL Server 2008


    I thought it was an interesting list, not as interesting as the one by Jason Massie (10 Reasons to Upgrade to SQL Server 2008), but still interesting :-) I would like to give you my list.


    Date data type

    We have data that goes back to 1896, so we cannot use smalldatetime instead of datetime. The new date data type supports dates back to the year 0001 and uses only 3 bytes instead of datetime's 8; when your table has billions of rows and grows by millions of rows a day, this will save a lot of storage. The time data type can also be beneficial if you need to find all customers who placed orders between 4 and 5 PM in the last 5 years.
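    As a sketch (the Orders table and its columns here are made up for illustration, not from the post), the new types look like this:

    ```sql
    -- date covers years 0001-9999 in 3 bytes, versus 8 bytes for datetime;
    -- time(0) stores just the time of day in 3 bytes
    CREATE TABLE dbo.Orders
    (
        OrderID   bigint IDENTITY PRIMARY KEY,
        OrderDate date    NOT NULL,
        OrderTime time(0) NOT NULL
    );

    -- All orders placed between 4 and 5 PM, regardless of date
    SELECT OrderID, OrderDate, OrderTime
    FROM dbo.Orders
    WHERE OrderTime >= '16:00' AND OrderTime < '17:00';
    ```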


    Partition-Aware Seek Operation also known as skip scan

    I still have to do some more testing with this, but here is what it basically does: the query processor performs a first-level seek operation to find every partition that contains rows that meet the condition; this identifies the partitions to be accessed. Within each partition identified, the processor then performs a second-level seek into the clustered index. More about this is explained here:


    Row and Page Level compression

    I have tested this, and for some tables performance is better while for others it is worse; you will need to test to see whether it makes sense to turn this on. If you have lots of repeated values on the page, then it should benefit you. I will have a blog post on this before the economy recovers (I just bought myself some time here).
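    If you want to experiment, a minimal sketch looks like this (the table name dbo.BigTable is hypothetical):

    ```sql
    -- Estimate the space savings before turning anything on
    EXEC sp_estimate_data_compression_savings
        @schema_name = 'dbo',
        @object_name = 'BigTable',
        @index_id = NULL,
        @partition_number = NULL,
        @data_compression = 'PAGE';

    -- Rebuild the table with page-level compression
    ALTER TABLE dbo.BigTable
    REBUILD WITH (DATA_COMPRESSION = PAGE);
    ```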


    Transact-SQL Row Constructors

    If you have ever had to let people upload files, parse those files on a web server, and import the data, you will appreciate row constructors. No longer do you need to call a proc for every row in the file and then check that the rowcount in the table matches the rowcount in the file. You will be able to insert a ton of rows in one shot; this makes the process much faster, and it is all part of one transaction: they all go in or none of them do. BULK INSERT is of course better, but most likely you only have one port open between the web server and the SQL Server (and if you don't, then you should).
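    A minimal sketch of the new syntax (the table and values are made up):

    ```sql
    CREATE TABLE #Upload (ID int, Name varchar(50));

    -- One statement, one implicit transaction: all three rows go in or none do
    INSERT INTO #Upload (ID, Name)
    VALUES (1, 'Alpha'),
           (2, 'Beta'),
           (3, 'Gamma');
    ```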


    C# in SSIS

    This might not seem like a big deal, but switching between C# and VB can be a pain in the neck. If you are a developer who does 50% SQL development and 50% C# Windows Forms/web forms/MVC development, then you no longer have to switch modes in your head every time you go from SQL to the web. If you are doing VB, then of course this doesn't apply to you.


    Table-Valued Parameters

    Table-valued parameters are nice because you no longer have to create temp tables before calling the proc just so that you can read from the temp table after the proc has executed. I always felt ‘dirty’ doing that; it felt like a hack.
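    A sketch of what that looks like (the type and proc names here are made up):

    ```sql
    CREATE TYPE dbo.IntList AS TABLE (Value int NOT NULL);
    GO
    CREATE PROCEDURE dbo.ProcessValues
        @Values dbo.IntList READONLY  -- TVPs must be declared READONLY
    AS
        SELECT Value FROM @Values;
    GO
    -- Fill the parameter and pass it in; no temp table before or after
    DECLARE @v dbo.IntList;
    INSERT INTO @v VALUES (1), (2), (3);
    EXEC dbo.ProcessValues @Values = @v;
    ```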


    Other new T-SQL things.

    Not as important but still nice are the MERGE (upsert) statement, the hierarchyid data type, and compound operators. I am sure some people will love the geometry and geography spatial data types; I have no use for them right now.
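    For example, a minimal MERGE (upsert) sketch with made-up table names:

    ```sql
    MERGE dbo.Target AS t
    USING dbo.Source AS s
        ON t.ID = s.ID
    WHEN MATCHED THEN
        UPDATE SET t.Name = s.Name
    WHEN NOT MATCHED THEN
        INSERT (ID, Name) VALUES (s.ID, s.Name);
    ```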


    Other new SQL Server things

    A couple of other things which I find interesting are:

    On partitioned tables, you can configure locks to escalate to the partitions instead of to the whole table by using the LOCK_ESCALATION option of ALTER TABLE.

    The FORCESEEK table hint.

    Resource Governor

    SQL Server Extended Events
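    The partition-level lock escalation mentioned above is a one-liner; as a sketch (assuming a partitioned table named dbo.BigTable):

    ```sql
    -- AUTO escalates locks to the partition for partitioned tables;
    -- the default, TABLE, always escalates to the whole table
    ALTER TABLE dbo.BigTable SET (LOCK_ESCALATION = AUTO);
    ```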



    There you have it. I will also be upgrading some machines from 2000 to 2008, so this list will be even bigger for those instances. Actually, those will be new machines as well.

  • Windows Server 2008 R2 Will Support 256 Logical Processors And Will Be 64 Bit Only

    At the Windows Hardware Engineering Conference (WinHEC), Microsoft announced that Windows Server 2008 R2 will be 64-bit only and will support up to 256 logical processors.

    Windows Server 2008 R2 also supports more than 64 processors, so customers can take advantage of advancements in hardware. Coupled with SQL Server 2008, this enables an optimum use of hardware to parallelize tasks and boost performance.

    Now that is pretty nice; I wonder what the licensing would be for 256 logical processors? At least you can run that Intel 80-core CPU on Windows now; before 2008 R2, Windows 'only' supported up to 64 logical processors.


    Read the Q&A with Bill Laing here:



  • Interview With Denny Cherry About SQL Server

    I have known Denny Cherry for a while now, consider him a friend, and have always been impressed with his SQL Server knowledge. I asked him to do an interview and, as you can see, he agreed.

    Denny, the first time I came across your name was in online forums. Can you explain to us why you participate in newsgroups and forums?

    I feel that as someone who has learned a decent amount about Microsoft SQL Server, it’s my responsibility to help make sure that others have access to the same information.  I could easily keep the information to myself, but in doing so I’ll only be shooting myself in the foot later on.  At some point in the future I’m sure I’ll end up in management at some level, and I’ll want DBAs that have the knowledge working for me.  I feel that the best way to do that is to make sure that the information stays out there circulating.  That, and it’s fun talking to other people who work on databases.

    Name three things that are new in SQL Server 2008 that you find are the most valuable?

    The new features which I’d say top my list are the Resource Governor, the Data Collection functions, and the new data types which have been added.


    What are some of your favorite high availability features in SQL Server?

    Clustering is my personal favorite high availability technique to use with SQL Server.  It is a great way to create a local HA cluster.  When going between data centers clustering gets very difficult to do if the data centers are more than a couple hundred miles apart which is where I really like to see Database Mirroring step up and take the workload.  Between the two techniques you could create a database solution which is virtually impervious to either a system failure or a datacenter failure.

    Can you list any third party tools that you find useful to have as a SQL Server developer/admin?

    I am a huge fan of several of the Quest Software tools.  My favorites among them are Spotlight for SQL Server Enterprise and Lightspeed for SQL Server.  Spotlight gives a nice easy to view look at the entire enterprise, and Lightspeed has saved us a fortune on our monthly backup costs at our co-lo.

    What are the most important things a person can do to master Transact-SQL?

    I follow the “necessity is the mother of invention” technique, probably more aptly the “necessity is the mother of learning”.  I don’t have the ability to sit down and read a reference book.  If I don’t have a need to pick up the new skill for a project then I really don’t get into it.  I’ve found that the way that I learn new skills is to come up with a project and build it.  It may take me forever, and probably won’t ever get done, but I learn tons during the process.  I picked up T/SQL the same way.  I needed to get something done, so I grabbed the book and started thumbing through the index until I found what I was looking for.

    How much Transact-SQL does an admin really need to know?

    A DBA certainly doesn’t need to have as much knowledge of T/SQL as a database developer, but you should still know a good amount of T/SQL.  You are going to need to be able to read the T/SQL which the developers give you to roll into production, as well as be able to performance tune that T/SQL.  Knowing the UI is good, but you won’t be able to get into the really senior level positions without a good amount of T/SQL knowledge.

    You were managing one of the largest SQL Server installations in the world, supporting more than 175 million users. I know when you go from million-row tables to billion-row tables it is a whole new ballgame. What kind of mindset shift is needed to be able to support such a large SQL Server installation?

    When dealing with tables that large in an OLTP environment you have to take a new approach to data management.  Normally when we look at deleting data we want to delete all the data in a single transaction.  When dealing with tables this large and users hitting the tables every second, you can’t afford to lock the table for an hour while you delete the data.  Because of this you have to look into doing all your data operations in small batches of a few thousand at a time.  While this takes a lot longer to complete, your locks are held for only a split second so the users never know that the operation is happening.  With data growth rates as high as MySpace’s installation we were constantly battling against the fact that we were growing the data faster than the SQL Server could process the statistics automatically.
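    The small-batch approach Denny describes can be sketched like this (the table name and batch size are made up):

    ```sql
    -- Delete in batches of 5000 so each transaction holds its locks
    -- only for a split second instead of locking the table for an hour
    DECLARE @rows int;
    SET @rows = 1;
    WHILE @rows > 0
    BEGIN
        DELETE TOP (5000) FROM dbo.Events
        WHERE EventDate < '20080101';
        SET @rows = @@ROWCOUNT;
    END
    ```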

    What are some of the biggest SQL Server setup mistakes that people make besides putting all the files on one drive?

    There are lots of mistakes which are very easy to make.  Probably the biggest that I see is that people do not correctly align their disk volumes.  This is when the boot sector is left at 32k causing the disks to do twice as much work as they need to.  Another big mistake that I’ve seen is not having enough RAM in the SQL Server.  These days RAM is not that expensive (until you want to put over 64 Gigs in a server), there just isn’t an excuse for not having enough RAM.

    What SQL Server books are on your bookshelf?

    My book shelf is surprisingly empty.  I have a bunch of SQL 2000 books sitting around, and a few SQL 2005 MS Press books that I was given when the SQL 2005 exams hit the street.  I can’t remember the last time I actually opened them, though.  These days my book shelf is

    Where can we expect to see you in 2008/2009? Any conferences, seminars or trade shows perhaps?

    I’ve got a busy schedule for the rest of 2008.

    On October 16th I’ll be speaking to the Orange County, CA SQL Server Users Group.

    On October 22nd I’ll be part of a webcast for Quest Software’s Quest Connect 2008 web conference.

    On October 25 and 26 I’ll be speaking at the SoCal Code Camp (it’s the weekend before and right next door to PDC, so just fly in a couple of days early).

    On November 17th and 18th I’ll be attending the Quest SQL Server Customer Advisory Board Summit where I’ll be speaking with the Quest Software Customers and other Advisory Board members.

    On November 19th through the 21st I’ll be attending the PASS summit. I’m not speaking this year, but I’ll be happy to say hello to anyone who walks up to me.

    The last thing I have scheduled for the year is the .NET users group in Riverside County, CA, which has asked me to come and speak to them.  They are, as far as I know, about the only users group near my house, so I simply couldn’t turn them down.  I’ll be speaking to them on December 9th.

    So far my schedule for 2009 is pretty light.  I’ve got the MVP summit on my schedule, as well as EMC World.  Once PASS schedules the 2009 summit that’ll be there as well.  I know that we will have a couple of SoCal Code Camps next year (we had three this year), so I’ll be speaking at them again.  And of course I’ll be speaking at any user groups that’ll have me.



    Thanks to Denny, and if you want to know what Denny is up to with SQL Server, then check out his blog here:

  • The code name for the next version of SQL Server is Kilimanjaro

    After Hydra, Sphinx, Shiloh (32-bit), Liberty (64-bit), Yukon, and Katmai, we finally got to the codename Kilimanjaro. So Kilimanjaro will be the codename for SQL Server 11.


    Nothing more to say here  :-(

    [Edit] Apparently there is more to say here, because Kilimanjaro is the code name for the next version of SQL Server, but this is not SQL 11  :-(


    Here is what it says in the press release

    Ted Kummert, corporate vice president of Microsoft’s Data and Platform Storage Division, showcased “Kilimanjaro,” which will further enrich SQL Server’s BI capabilities while providing a robust and scalable data platform capable of supporting the largest BI deployments. “Kilimanjaro” will include a set of new, easy-to-use analysis tools for managed self-service, project-code-named “Gemini,” that will enable information workers to slice and dice data and create their own BI applications and assets to share and collaborate on from within the familiar, everyday Microsoft Office productivity tools they already use. Customers and partners will be able to gain early access to “Kilimanjaro” within the next 12 months via a community technology preview (CTP) with full product availability slated for the first half of calendar year 2010. [/Edit] 


  • Converting IP Addresses Between Bigint and Varchar And How to Sort IP Addresses

    I was wondering yesterday what else to add to the SQL Server Programming Hacks; I used Twitter and asked for some inspiration, and K. Brian Kelley responded with "Some systems store IPs as large integers. Converting back and forth would be cool."

    Interesting; I always use 4 tinyints or plain vanilla varchar(15) to store IP addresses. The nice thing about storing IP addresses in tinyint columns is that you don't have to check whether the IP address is valid: a tinyint can only hold values between 0 and 255.
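    As a sketch of the 4-tinyint approach (the column names are made up):

    ```sql
    CREATE TABLE #Ips
    (
        Octet1 tinyint, Octet2 tinyint,
        Octet3 tinyint, Octet4 tinyint
    );

    INSERT INTO #Ips VALUES (127, 0, 0, 1);  -- fine
    -- INSERT INTO #Ips VALUES (999, 0, 0, 1) would fail:
    -- tinyint rejects anything outside 0-255
    ```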

    Before we start with code, let us take a sample IP address: 127.0.0.1. Does it look familiar? Yes, that is your local loopback address.

    Here it is in decimal and binary
    127 0 0 1
    01111111 00000000 00000000 00000001

    Now to convert, you would take the last octet,
    add the third octet multiplied by 256,
    add the second octet multiplied by (256 * 256) = 65536,
    and add the first octet multiplied by (256 * 256 * 256) = 16777216.

    So in our case the select would be

    1 +
    0 * 256 +
    0 * 65536 +
    127 * 16777216

    which is 2130706433

    So converting from an IP address to an integer is very simple: you use PARSENAME to split it up and do the math. Here is the function.

    CREATE FUNCTION dbo.IPAddressToInteger (@IP AS varchar(15))
    RETURNS bigint
    AS
    BEGIN
    RETURN (CONVERT(bigint, PARSENAME(@IP, 1)) +
        CONVERT(bigint, PARSENAME(@IP, 2)) * 256 +
        CONVERT(bigint, PARSENAME(@IP, 3)) * 65536 +
        CONVERT(bigint, PARSENAME(@IP, 4)) * 16777216)
    END



    But how do you get the IP address back out of 2130706433?
    It is the reverse of what we did before (surprise): instead of multiplying we will be dividing.
    Here is the function


    CREATE FUNCTION dbo.IntegerToIPAddress (@IP AS bigint)
    RETURNS varchar(15)
    AS
    BEGIN
    DECLARE @Octet1 tinyint
    DECLARE @Octet2 tinyint
    DECLARE @Octet3 tinyint
    DECLARE @Octet4 tinyint
    DECLARE @RestOfIP bigint

    SET @Octet1 = @IP / 16777216
    SET @RestOfIP = @IP - (@Octet1 * 16777216)
    SET @Octet2 = @RestOfIP / 65536
    SET @RestOfIP = @RestOfIP - (@Octet2 * 65536)
    SET @Octet3 = @RestOfIP / 256
    SET @Octet4 = @RestOfIP - (@Octet3 * 256)

    RETURN(CONVERT(varchar(3), @Octet1) + '.' +
        CONVERT(varchar(3), @Octet2) + '.' +
        CONVERT(varchar(3), @Octet3) + '.' +
        CONVERT(varchar(3), @Octet4))
    END


    Now let's try this out, first run this

    SELECT dbo.IPAddressToInteger('127.0.0.1')

    That returns 2130706433
    Now run this

    SELECT dbo.IntegerToIPAddress(2130706433)

    That returns 127.0.0.1


    Sorting IP Addresses when they are stored as varchar
    When you store IP Addresses in a varchar column and someone asks you to sort them it is not that obvious how to do this.
    Execute the code below, which inserts some sample IP addresses


    CREATE TABLE #IpAddresses(IP VARCHAR(15))

    INSERT #IpAddresses VALUES('10.0.0.1')
    INSERT #IpAddresses VALUES('10.0.0.2')
    INSERT #IpAddresses VALUES('10.0.0.10')
    INSERT #IpAddresses VALUES('2.20.30.40')
    INSERT #IpAddresses VALUES('100.0.0.1')
    INSERT #IpAddresses VALUES('127.0.0.1')
    INSERT #IpAddresses VALUES('192.168.1.1')
    INSERT #IpAddresses VALUES('192.168.1.10')
    INSERT #IpAddresses VALUES('192.168.1.2')

    Now sort this data

    SELECT * FROM #IpAddresses
    ORDER BY IP






    As you can see, the order is not correct; this is because the data is sorted as varchar (character by character), so you need to sort on the integer values to 'fix' this.

    Run this code

    SELECT * FROM #IpAddresses
    ORDER BY CONVERT(int, PARSENAME(IP, 4)),
        CONVERT(int, PARSENAME(IP, 3)),
        CONVERT(int, PARSENAME(IP, 2)),
        CONVERT(int, PARSENAME(IP, 1))

    This time the addresses come back in the correct numeric order.
    Thanks to K. Brian Kelley for the inspiration and T-SQL for this post; you can also check out some of his books and articles.

  • Run Microsoft Windows Server And SQL Server on Amazon EC2

    Amazon made an announcement today that you will be able to run Microsoft Windows Server and SQL Server on Amazon EC2.

    Starting later this Fall, Amazon Elastic Compute Cloud (Amazon EC2) will offer you the ability to run Microsoft Windows Server or Microsoft SQL Server. Today, you can choose from a variety of Unix-based operating systems, and soon you will be able to configure your instances to run the Windows Server operating system. In addition, you will be able to use SQL Server as another option within Amazon EC2 for running relational databases.

    Amazon EC2 running Windows Server or SQL Server provides an ideal environment for deploying ASP.NET web sites, high performance computing clusters, media transcoding solutions, and many other Windows-based applications. By choosing Amazon EC2 as the deployment environment for your Windows-based applications, you will be able to take advantage of Amazon’s proven scalability and reliability, as well as the cost-effective, pay-as-you-go pricing model offered by Amazon Web Services.


     More info here:

  • SQL Server to scale into hundreds of terabytes of data by using DATAllegro

    Microsoft just released an interesting press release:  Microsoft Closes Acquisition of DATAllegro

    The important stuff from this press release:

    Microsoft will offer a new solution based on DATAllegro’s technology that extends Microsoft SQL Server to scale into hundreds of terabytes of data. The company will begin giving customers and partners early access to the combined solution through community technology previews (CTPs) within the next 12 months, with full product availability scheduled for the first half of calendar year 2010.

    Some info from their website: DATAllegro v3 performs table scans between 0.5TB/minute and 10.5TB/minute. DATAllegro v3 offers new capabilities to handle complex workloads that are a mixture of near real-time loads, long analytical queries, and short quick-hit queries. Improved workload management gives queries with low workloads higher priority, reducing the overall workload on the server. DATAllegro also provides a comprehensive architecture with a high-speed loader performing loads at a rate of over 1TB/hour. The loading server, or landing zone, within the high-speed appliance network is used to provide loading services with minimal impact to queries being run.

    You can read more about DATAllegro here

  • SQL Server 2008 Express with Advanced Services And Tools Now Available For Download

    I just noticed that there are three versions of SQL Server 2008 Express available for download:


  • SQL Server 2008 Express
    • SQL Server database engine - create, store, update and retrieve your data
  • SQL Server 2008 Express with Tools
    • SQL Server database engine - create, store, update and retrieve your data
    • SQL Server Management Studio Basic - visual database management tool for creating, editing and managing databases
  • SQL Server 2008 Express with Advanced Services
    • SQL Server database engine - create, store, update and retrieve your data
    • SQL Server Management Studio Basic - visual database management tool for creating, editing and managing databases
    • Full-text Search - powerful, high-speed engine for searching text-intensive data
    • Reporting Services - integrated report creation and design environment to create reports

    Get it all here

    If you register your copy you can get a free eBook and other goodies

  • Interesting Bug/Feature In SQL Server 2008 RTM

    Someone had a problem with 8-year-old procs which started to fail after moving to SQL Server 2008.
    Of course he should have used ints, but let's see what happens.

    Run this code on SQL Server 2005 and 2000

    DECLARE @num_Passed Numeric(2, 0);
    SET @num_Passed = -1;
    SELECT @num_Passed

    IF (@num_Passed = 0)
        PRINT 'True';


    No problem right?

    Run just this part on SQL 2008

    DECLARE @num_Passed Numeric(2, 0);
    SET @num_Passed = -1;
    SELECT @num_Passed

    No problem either
    Now run this whole thing

    DECLARE @num_Passed Numeric(2, 0);
    SET @num_Passed = -1;
    SELECT @num_Passed

    IF (@num_Passed = 0)
        PRINT 'True';

    Oops, this is what we get
    Server: Msg 8115, Level 16, State 2, Line 7
    Arithmetic overflow error converting expression to data type tinyint.

    Change the -1 to 1

    DECLARE @num_Passed Numeric(2, 0);
    SET @num_Passed = 1;
    SELECT @num_Passed

    IF (@num_Passed = 0)
        PRINT 'True';

    No problem either.

    Run this

    IF (convert(Numeric(2, 0), -1) = 0)
        PRINT 'True';

    That fails
    Let's make it numeric(3,0)

    IF (convert(Numeric(3, 0), -1) = 0)
        PRINT 'True';

    No problem, that runs fine. So is this a bug caused by an implicit conversion to tinyint, which can't hold negative values?
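    If you hit this, one possible workaround (a sketch; I have not tested it against every build) is to cast the literal explicitly so no tinyint conversion can happen:

    ```sql
    DECLARE @num_Passed Numeric(2, 0);
    SET @num_Passed = -1;

    -- Cast the literal to the same type as the variable
    IF (@num_Passed = CONVERT(Numeric(2, 0), 0))
        PRINT 'True';
    ```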

  • SQL Server 2008 Management Studio tip: Status Bar Custom Colors

    Joni Moilanen has a very nice tip about how you can use custom colors in the status bar so that you don't truncate that production table yet again :-)

    Check it out here:

  • SQL Tip, Compiling Your SQL Without Running It To See If It Would Run

    Let's say you have a big SQL script with a ton of code, and you want to make sure it runs, but you don't want to execute it because it updates tables, deletes data, etc.
    Take this simple example



    SELECT GETDATE()

    SELECT 1/asasasas


    You can probably guess that the second statement is not valid; when you have a lot of code, it is more difficult to spot these things.
    Execute the code above and you will get this

    (1 row(s) affected)

    Server: Msg 207, Level 16, State 1, Line 1
    Invalid column name 'asasasas'.

    SQL Server has the SET NOEXEC statement. From BOL:

    When SET NOEXEC is ON, Microsoft® SQL Server™ compiles each batch of Transact-SQL statements but does not execute them. When SET NOEXEC is OFF, all batches are executed after compilation.

    The execution of statements in SQL Server consists of two phases: compilation and execution. This setting is useful for having SQL Server validate the syntax and object names in Transact-SQL code when executing. It is also useful for debugging statements that would usually be part of a larger batch of statements.

    The setting of SET NOEXEC is set at execute or run time and not at parse time.

    So execute the code below

    SET NOEXEC ON

    SELECT GETDATE()

    SELECT 1/asasasas

    SET NOEXEC OFF





    As you can see the output is the following:
    Server: Msg 207, Level 16, State 1, Line 1
    Invalid column name 'asasasas'.

    You never see the getdate. Merely parsing that code will not throw an error, and because of deferred name resolution you can fat-finger table names and it will still parse without a problem; using NOEXEC is a nice way of seeing whether your code would actually run. The question is, of course, whether you need to do this at all, since we all have a QA server <g>


  • Microsoft SQL Server 2008 Feature Pack, August 2008 Available for download

    The 2008 Feature Pack for Microsoft SQL Server 2008, a collection of stand-alone install packages that provide additional value for SQL Server 2008, is available for download. One of these components is Microsoft SQL Server 2008 Policies.

      Microsoft SQL Server 2008 Policies are examples of how you can take advantage of Policy Based Management. These policies will help you follow some of the SQL Server best practices and avoid common pitfalls. For more information, please see Administering Servers by Using Policy Based Management in SQL Server 2008 Books Online.

    Download it here:
