SQLblog.com - The SQL Server blog spot on the web

Jamie Thomson

This is the blog of Jamie Thomson, a freelance data mangler in London

Thoughts on Test Driven Database Development

Test-Driven Development (TDD) is a software development practice that has been around for a few years. Wikipedia describes it as:

Test-driven development (TDD) is a software development process that relies on the repetition of a very short development cycle: first the developer writes a failing automated test case that defines a desired improvement or new function, then produces code to pass that test and finally refactors the new code to acceptable standards. Kent Beck, who is credited with having developed or 'rediscovered' the technique, stated in 2003 that TDD encourages simple designs and inspires confidence.

http://en.wikipedia.org/wiki/Test-driven_development
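In database terms, one pass through that red-green cycle might look like the following T-SQL sketch (the object names and the function body are hypothetical, purely for illustration):

```sql
-- Red: write a failing test first. dbo.fn_GetAccountBalance does not exist
-- yet, so this script errors: that is the failing state TDD starts from.
IF dbo.fn_GetAccountBalance(42) <> 100.00
    RAISERROR('fn_GetAccountBalance returned the wrong balance', 16, 1);
GO

-- Green: write just enough code to make the test pass, then refactor.
CREATE FUNCTION dbo.fn_GetAccountBalance (@AccountId INT)
RETURNS DECIMAL(18, 2)
AS
BEGIN
    RETURN (SELECT SUM(Amount)
            FROM dbo.AccountTransaction
            WHERE AccountId = @AccountId);
END;
```

Once the test passes, the next failing test drives the next piece of functionality.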

Since 2003, TDD practices have seen refinements such as Behavior-Driven Development and Uncle Bob's Three Rules of TDD, and all the while TDD has pretty much become an accepted way of developing quality software. Accepted, that is, everywhere except the database development arena, and that is the arena in which I spend my working life. TDD simply has not, in my opinion, caught on with database developers like it has with our appdev brethren, and I was reminded of this yesterday when Atul Thakor asked on Twitter:

anyone done TDD for database development and would they recommend it?

https://twitter.com/#!/atulthakor/status/161886007929733120

To which my answer was an emphatic:

(1) yes & (2) absolutely, yes

https://twitter.com/#!/jamiet/status/161894215217987585

I'll use this blog post to expand on that outside of 140 characters.

 

In October 2010 I undertook a mini-project for the client I was working for at the time (a bank) where a colleague and I were tasked with building the database portion of a system that would support reconciliation of our ETL processes. It was a nice piece of work in that it was small, well-scoped, time-bound, greenfield, did not have any external dependencies, and had a technically savvy product owner. We sat down at the start and decided that this was an ideal opportunity to trial TDD as a method of developing a database; I would write the failing tests and my colleague would make the tests pass. We came up with some guiding principles and, although we didn't know it at the time, they were pretty close to Uncle Bob's three rules.

I used Visual Studio 2010's database unit testing framework [1] to write my tests and have them run as part of our Continuous Integration (CI) build (see Setting up database unit testing as part of a Continuous Integration build process). I would write the tests and check in, the CI build would fail, and my colleague would "get latest" in order to see what code he had to write to stop the build from failing. To cut a long story short, the use of TDD was considered a great success; we shipped a working system on time and on budget and, moreover, even though I didn't write a scrap of code that went into production, I have never had more confidence that a system I was involved in building worked as intended. That's quite a statement. My confidence stemmed from the fact that, as the test author, I was ultimately responsible for ensuring that the system did what it was supposed to; I could qualify my confidence by pointing at our CI build and highlighting the number of tests that were passing and how that number had steadily increased as the project progressed.
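Under the covers, a Visual Studio database unit test is essentially a T-SQL script whose results are checked by declarative test conditions. Stripped of the designer, the same idea can be hand-rolled in plain T-SQL; here is a sketch in that spirit (the procedure and table names are hypothetical, not from the actual project):

```sql
-- Hand-rolled equivalent of a database unit test: call the procedure
-- under test, then assert on the observable outcome, failing loudly
-- (and hence failing the CI build) if the outcome is wrong.
DECLARE @errorCount INT;

EXEC dbo.usp_ReconcileBatch @BatchId = 1;   -- procedure under test

SELECT @errorCount = COUNT(*)
FROM dbo.ReconciliationError
WHERE BatchId = 1;

IF @errorCount <> 0
    RAISERROR('Expected 0 reconciliation errors for batch 1, found %d',
              16, 1, @errorCount);
```

The framework's value-add over a raw script like this is that the assertions are declarative test conditions and the results are surfaced in the test runner and CI build report.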

By the time the project had finished the database consisted of (if memory serves me correctly) 6 tables and about 10 stored procedures or functions (so yes, very small). To test that we had roughly 70 tests that were getting run up to 20 times a day. The project had taken about two months from start-up to final delivery - you can form your own opinion as to whether you consider that prompt or tardy, but our product owner was happy and that's pretty much all that counted as far as I was concerned.

Since that project I have moved onto other clients and at each one I have always extolled the use of database unit testing; we haven't always practised TDD but at each one we have been writing database unit tests and in the future I suspect that a client's willingness (or lack thereof) to use database unit testing will be a major factor in influencing whether we end up working together or not.

Are you a database developer doing database unit testing or perhaps even TDD? Let me know in the comments, I'd love to hear about others' experiences.

@jamiet

[1] Yes, that linked-to article from 7 years ago is the best one I could find to describe what Visual Studio's database unit testing framework actually is - sort it out Microsoft!

UPDATE: I have just remembered that Jamie Laflen has written an excellent whitepaper entitled Apply Test-Driven Development to your Database Projects that goes into much more detail about how to achieve database TDD using Visual Studio than I have here. Well worth a read.

Published Thursday, January 26, 2012 9:30 AM by jamiet


Comments

 

Kev Riley said:

Jamie, it seems from reading this that one of the factors critical to the success was that you wrote the tests and someone else wrote the code. That approach feels much better than when the same dev is responsible for both the tests and the code - has this been the case in other experiences with database unit testing?

January 26, 2012 5:23 AM
 

jamiet said:

Hi Kev,

Good question. I've used this setup (i.e. me writing the tests, someone else making them pass) once more since the above mentioned project and, again, I'm of the opinion that it was (by and large) a success.

I have written database unit tests on other projects since but that's not necessarily the same as doing TDD; on the two occasions where I *have* used TDD, a different person wrote the tests than wrote the code. Based on that experience I would say that this setup works well but, having never done TDD where I wrote both the tests *and* the code, I can't say for sure whether one works better than the other. Ostensibly though, yes, I think it's better for the tests to be written by a different person than the one who writes the code.

Hope that answers your question.

JT

January 26, 2012 5:34 AM
 

Craig Ottley-Thistlethwaite said:

In the not too distant future I'll be creating unit tests for databases so will be able to comment further once I've got my hands dirty. Thanks for posting the link to Jamie Laflen's whitepaper :o)

January 26, 2012 5:36 AM
 

jamiet said:

Craig, I'll be expecting a reply here in the future then :)

January 26, 2012 5:53 AM
 

Eric Wisdahl said:

Great article. I've been following your Database Projects and Testing articles pretty closely as I'm hoping to implement something similar. Of the few people I've talked to who have done it, almost all liked it; however, a few have questioned the ROI. I personally think the ROI would be high in terms of the confidence factor for the database, but I don't know whether the business would believe so or not. Thanks for posting.

January 26, 2012 8:51 AM
 

jamiet said:

Hi Eric,

Yes, convincing the business of the ROI is difficult.

-"We'd like to spend some time writing code to prevent mistakes creeping in in the future?"

-"What? You want me to pay you to write code that anticipates the fact that you're going to write bad code?"

Always a tricky one :)

To be honest it all depends on whether they've bought into agile or not and accepted that, in the future, something WILL change. Once they get into that mindset then it's a bit of an easier sell IMO.

I think the ROI is a lot easier to justify on longer term projects; however, I think my anecdote above proves that it can be beneficial on short term projects too.

JT

January 26, 2012 9:07 AM
 

Creighton said:

To justify ROI, you might point them to one of the many studies which highlights the cost of finding bugs in different stages of development.  Early = Less $$$

http://agileelements.wordpress.com/2008/04/22/cost-of-software-defects/

CK

January 26, 2012 3:13 PM
 

jamiet said:

Thanks Creighton, interesting stuff.

January 26, 2012 3:23 PM
 

Alexander Kuznetsov said:

Jamie,

I prefer BDD to TDD. It works out much better for me. Dan North is my coworker, which might also help a little bit. I have just written up a piece about my approach in database unit testing: http://www.simple-talk.com/sql/database-administration/close-those-loopholes-lessons-learned-from-unit-testing-t-sql/

My project is larger: several hundred tables, thousands of procedures and functions. Many of them are largely generated. We do not have full test coverage for trivial generated modules. In C# we would not have them at all - we would use generics.

However, more complex modules in our system have lots of tests.

January 26, 2012 10:19 PM
 

jamiet said:

Hi Alex,

Great article, thanks for that. Interesting to note that you opt to author the tests in C# - I've noticed many others doing similar in the past.

BDD I'm rather fuzzy on (even more so than TDD). I think I have some reading to do.

JT

January 27, 2012 4:03 AM
 

 

gareth said:

Hi Jamie,

Apologies as I realise you posted this nearly a year ago now but I'm interested in db unit testing and possibly TDD.

Before you wrote the first tests, did you:

(a) Just have an empty database, and your colleague create table(s) and proc(s) to pass the test?

(b) Already have the data model designed, including tables, views, DRI etc?

(c) Some other approach?

i.e. are the tests purely driving T-SQL logic, or database design, or both?

Thanks, and great post as ever, your blog is an excellent resource.

Gareth

November 20, 2012 10:50 AM
 

jamiet said:

Hi Gareth,

We started with a blank slate - CREATE DATABASE MyDB.

If memory serves it was me designing the data model, but the crux of the solution was the code in the stored procedures, which were written by my colleague.

The tests were essentially testing T-SQL logic but of course, those sprocs accessed tables so indirectly the tables were being tested too.

Testing "database design" (which you might think of as the sum of all the parts) is hard because you can't really say whether a database design has passed or failed. It's not a true-or-false, 0-or-1 type of scenario, right? An overall design is an evolving beast.

Hope that helps

JT

November 20, 2012 11:14 AM
 

gareth said:

Thanks for your quick response Jamie. Yep, you're right, there's no easy answer to testing db design, particularly if you're TDDing it, I imagine.

Just wondering if given an existing relational data model you've created, with a few complex relationships governed by constraints - would you look to write negative tests that attempt to violate those constraints?  

Or is that asking for too much of a test maintenance headache when the design evolves?

Also do you tend to write tests to check things like input parameter validation are working as expected?

Not sure I can easily TDD my current project as the requirements are still a little hazy (though I realise that's probably a good argument for, not against!)

November 22, 2012 5:34 AM
 

jamiet said:

"Just wondering if given an existing relational data model you've created, with a few complex relationships governed by constraints - would you look to write negative tests that attempt to violate those constraints?"

Yes, I think I probably would. I want to make sure that bad data cannot get in.

Test maintenance is an issue - but it always has to be balanced against the cost of not having the tests in the first place, and personally that's a trade-off I'm happy with. If you have your tests running as part of your CI build then you'll know pretty early on when they've broken and thus whether they are still valid or not.
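One way to sketch such a negative test in T-SQL is to attempt the violation inside TRY/CATCH and assert that the expected constraint error was raised (error 547 covers foreign key and CHECK constraint violations; the table and column names here are hypothetical):

```sql
-- Negative test: bad data must be rejected by the FK constraint.
DECLARE @errorNumber INT = 0;

BEGIN TRY
    INSERT INTO dbo.OrderLine (OrderId, ProductId, Qty)
    VALUES (-1, 10, 5);   -- -1 is not a valid OrderId
END TRY
BEGIN CATCH
    SET @errorNumber = ERROR_NUMBER();   -- capture what actually happened
END CATCH;

-- 547 = constraint violation; anything else (including no error at all,
-- which would mean the bad row got in) is a test failure.
IF @errorNumber <> 547
    RAISERROR('Test failed: expected constraint error 547, got %d',
              16, 1, @errorNumber);
```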

"Also do you tend to write tests to check things like input parameter validation are working as expected?"

Yeah definitely. I'm a big fan of defensive programming (i.e. checking input params inside the sproc before actually doing anything with them) but I still think there's value in writing appropriate tests - in fact probably even more so, so you can check that your defensive programming code is doing what you expect.
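A minimal sketch of that pairing, assuming a hypothetical procedure with a defensive parameter check, and a test that asserts the check actually fires:

```sql
-- Defensive programming: validate parameters before doing any work.
CREATE PROCEDURE dbo.usp_TransferFunds   -- hypothetical procedure
    @FromAccountId INT,
    @ToAccountId   INT,
    @Amount        DECIMAL(18, 2)
AS
BEGIN
    IF @Amount IS NULL OR @Amount <= 0
    BEGIN
        RAISERROR('@Amount must be a positive value', 16, 1);
        RETURN;
    END;
    -- ... the actual transfer logic goes here ...
END;
GO

-- The matching test: calling with a bad parameter must raise the error.
DECLARE @rejected BIT = 0;

BEGIN TRY
    EXEC dbo.usp_TransferFunds @FromAccountId = 1,
                               @ToAccountId   = 2,
                               @Amount        = -50;
END TRY
BEGIN CATCH
    SET @rejected = 1;   -- the defensive check fired, as expected
END CATCH;

IF @rejected = 0
    RAISERROR('Test failed: negative @Amount was accepted', 16, 1);
```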

Hope that's useful

JT

November 22, 2012 6:50 AM
