THE SQL Server Blog Spot on the Web


Jamie Thomson

This is the blog of Jamie Thomson, a freelance data mangler in London

Responses to SSDT questions

My recent article Get to Know SQL Server 2012's SQL Server Data Tools prompted two questions to arrive via email and, in the interests of sharing knowledge via search engines, I thought I would answer them here rather than simply replying by email (I hate that so much useful information gets trapped inside closed inboxes).


Question: How would you handle the cases below? Our main database already has many objects; we are using Red Gate source control to keep things under source control, but SSDT would be awesome.
Our database has some invalid stored procs etc. that, unfortunately, we do not want to drop; the underlying objects either don't exist or have changed.
There are linked servers and code referencing other objects stored in the database.
There are staging tables, named automatically, in the database; we do not want to include them in the project, nor drop or touch them in a deploy.

JT: The overall question here seems to be “how do we prevent SSDT from DROPping sprocs and staging tables that are in the target database but are not in the SSDT project?” Fortunately SSDT has an option that allows exactly what the questioner requires. When an SSDT project is published one has the option to set some properties relating to that publish. One of those settings is “DROP objects in target but not in project”:

[Screenshot: the “DROP objects in target but not in project” checkbox in the SSDT publish settings dialog]

If that option is left unchecked (which is the default) then no objects will get removed from the target database.
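The same setting is exposed as a SqlPackage.exe publish property for scripted deployments. A minimal command-line sketch, assuming a built dacpac called MyDatabase.dacpac and a local target database (both hypothetical names):

```shell
REM Publish without dropping objects that exist in the target but not in the project.
REM DropObjectsNotInSource defaults to False; stating it explicitly documents the intent.
SqlPackage.exe /Action:Publish ^
  /SourceFile:MyDatabase.dacpac ^
  /TargetServerName:localhost ^
  /TargetDatabaseName:MyDatabase ^
  /p:DropObjectsNotInSource=False
```

Flipping that property to True gives the opposite behaviour, so it is worth pinning it explicitly in any automated deployment.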


Question: At my previous shop I did a proof of concept with one of our databases, but never got it implemented as VS2012 (with SSDT) had not been released yet. Fast forward 9 or so months, and at my new shop (which is extremely agile) I've been tasked with setting up our db projects with SSDT so that, for the immediate short-term goals/objectives, we can:

  • Allow new or existing developers to easily get the latest databases from source control in TFS, built and deployed onto our local sandbox db environments. Ideally this will be done via a PowerShell script or something of the sort.
  • Be able to employ CI so that every night a database deployment is done to our system test servers.

Three days into it, I've wrestled with 3 of our databases, reverse engineering them into TFS and getting them built etc. My question is, what are the best practices or guidelines for:

  1. Allowing only Windows authentication for the local deploys to the sandbox environment. I.e. is it best to have a script containing all our team user logins? How would the project recognise a team member so as to create a login for them, so that once the project is deployed they can log in to their local db?
  2. Using a 1-touch/click action (preferably outside of VS2010) to get latest, build project & deploy to local sandbox
  3. Same as above but to a system test server

JT: There are quite a few questions there. Firstly I’ll say that it’s great to hear of sandboxed environments and Continuous Integration (CI) being used – I openly advocate these techniques in my blog post Implementing SQL Server solutions using Visual Studio 2010 Database Projects – a compendium of project experiences.

Now, taking each question in turn:

  1. Windows Auth for local deploys. I’m not sure there’s any recognised best practice here, but if developers are working on local sandboxes I would assume they have sysadmin rights, so I’m not sure there’s a need for a script of team logins. Perhaps I have misunderstood the question, in which case please clarify in the comments below. I will say this though – I think it’s better to have a Windows group that you can move all of your devs in and out of rather than having to modify a script each time a user joins your team.
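To illustrate the Windows group approach, a one-off sketch using sqlcmd against the sandbox instance (the domain and group names here are hypothetical; substitute your own):

```shell
REM Create a login for a Windows group and grant it sysadmin on the local sandbox,
REM so team membership is managed in Active Directory rather than in a deployment script.
sqlcmd -S localhost -E -Q "CREATE LOGIN [MYDOMAIN\DatabaseDevelopers] FROM WINDOWS; ALTER SERVER ROLE [sysadmin] ADD MEMBER [MYDOMAIN\DatabaseDevelopers];"
```

Adding or removing a developer then becomes an Active Directory group-membership change, with no script edits at all.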
  2. 1-click get latest, build & deploy. Scripting your build & deploy is something else I advocate in the aforementioned blog post:

    Building and deploying your datadude projects within Visual Studio can become a real time hog; in my experience it’s not unusual for deployments to take many minutes, and your Visual Studio environment will be unavailable for further development work during that time. For that reason I recommend investing some time in writing msbuild scripts that will build and deploy your project(s) from the command line.

    I’ll go further here and say that the script you use to build and deploy to your sandbox should be the same script you use for your production environment and everything in between. The script should take a parameter naming the target environment and then deploy appropriately.
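    A sketch of such a parameterised build-and-deploy script, in Windows batch form (the project, dacpac, and publish-profile names are hypothetical, and it assumes one SSDT publish profile per environment):

```shell
@echo off
REM deploy.cmd <environment>  -- e.g. deploy.cmd sandbox | systest | production
if "%1"=="" (
  echo Usage: deploy.cmd ^<environment^>
  exit /b 1
)

REM Build the SSDT project from the command line (no Visual Studio session needed)
msbuild MyDatabase.sqlproj /p:Configuration=Release || exit /b 1

REM Publish the resulting dacpac using the environment-specific publish profile
SqlPackage.exe /Action:Publish ^
  /SourceFile:bin\Release\MyDatabase.dacpac ^
  /Profile:Profiles\%1.publish.xml
```

    Because the environment name selects the publish profile, the identical script serves sandbox, system test, and production.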

  3. See 2.

Hope that is useful. Any questions? Please put them in the comments.

@Jamiet

Published Friday, January 11, 2013 11:22 PM by jamiet


Comments

 

Peter Schott said:

I'd probably argue, for the first question, that they also set those objects' Build action to "None" once they're in the project. That way they'll have the old code in version control, but can ensure that it's not being released to new servers. Any code pointing to them can also be deprecated, because it won't work at all instead of partially working.

For the second person, your article at http://sqlblog.com/blogs/jamie_thomson/archive/2010/07/21/a-strategy-for-managing-security-for-different-environments-using-the-database-development-tools-in-visual-studio-2010.aspx could also be helpful. (And I've got another small tweak to the "Role" portion of the script that I'll post there shortly.) For our team, we need to test our app with certain SQL logins at this time. Those need to be released to the local servers, and then the permission scripts appropriate to each environment are run. If that person is in a similar situation, using those post-deploy permission scripts is very helpful. Of course, having the devs be sysadmins on their local boxes may be the best/easiest solution.

Totally agree that situation #2 should be automated. Use SQLPackage or MSBuild to get that going and don't forget to get the latest release code before publishing. :)

January 11, 2013 5:42 PM
 

Tyler said:

So, what exactly qualifies as an "object"? It seems you're implying that tables are classified as objects; what about stored procedures? What about constraints? What about indexes? What about views? What about logins?

July 25, 2014 1:09 PM
 

Peter Schott said:

@Tyler, to me anything in the project could be considered an object - tables, stored procs, constraints, views, logins. If there's a script/file for it, I was treating it generically as an "object".  Set those to "Not in Build" and you'll still have the script that defined it, but won't try to work with it when you publish. It sounds like you might have a bigger question around this, though.

July 25, 2014 1:27 PM
