THE SQL Server Blog Spot on the Web


Louis Davidson

Devlink Wrapup…or…Where I started to realize that the data world is “different”

First off, let’s just say that Devlink is a fantastic conference. It returned to the Lipscomb campus this year, and that was a great blessing. MTSU last year was not as nice a venue, and some of the way it had to be organized wasn’t as good as either of the years I have been at Lipscomb. Part of that is that Lipscomb is a private college, but another part is that it is a smaller one. The food I ate was good, the seating was nice, and the layout was awesome. The first day we had comedian Rik Roberts during lunch, and two years ago he was the party. The party this year was a baseball game, with a small extra charge to attend. I actually missed the party to go to a different gathering, but it was still a good idea. Probably the only negative thing I could say about the conference was that it was harder to find a drink, and the refrigerators weren’t as cold as the barrels of ice from previous years. And if that is the worst thing I can think of, it was a great conference.

Second, the sessions I attended ranged from great to excellent. Paul Nielsen and I kicked off the conference for the database track (after lots of last-minute prep…the worst misstep was a few slides that had black text on blue). In the afternoon I attended Tommy Norman’s Scrum/XP in Team System Deep Dive session and learned more about Scrum and some things you can do with Team System (the Team System material was kind of out of my league…why can’t we use Management Studio yet for Team System connectivity?). After the first day of expanded sessions, I attended the opening keynote by Josh Holmes, a developer evangelist with Microsoft, whose topic was whether or not software development has gotten too complex. That was the first thing that stuck in my mind.

Over the next two days, I attended lots of sessions by Brad McGehee, Kevin Kline, Joe Webb, Jessica Moss, Craig Berntson, and Wally McClure. All of the sessions were great, but one of them was the second thing that really stuck in my mind. In addition, I saw Tim Ford, Jeremiah Peschka, Robert Cain, Stephen Russell, and many others.

Finally, we had a taping of DotNetRocks Live. Carl Franklin and Richard Campbell led their show “live” in front of a studio audience. Again, complexity was the topic…

And that is when I decided that I had to do a bigger write-up than I had gotten to so far. Complexity? Complexity? Complexity?!? As a data architect/programmer, this started to bug me… In the data-driven design session, Craig Berntson had really gotten me cranked up by starting to imply that data wasn’t that important. Design “process” first. His words sounded like every developer I have known who wanted to build the database themselves. I think we eventually got to the point where he liked having strong data assistance, but one idea that I think was presented was this:


In my database design sessions, one of the most important things I try to get through to people is that you cannot have a screen that maps to a class that maps to a table perfectly. If you do, your database will stink, or your UI will stink. Your classes are also going to be a mess… “But wait,” you say, “this is exactly what we do.” It is what most people do, because it takes less work. He was coming from the class side of the equation, whereas most data people design the data first and everything else after. I would note that this is not a good idea, even if you are by trade the data architect. First come the processes and needs of the client; then the data, and everything else shakes out from there.

Of course, as a data guy, I want to put up a few layers to deal with this. One is a set of stored procedures that encapsulates all interaction with the data, and another is a solid layer of software that works with the stored procedures. Many business rules (those without any override, for sure) would reside minimally in the data structures, again in the UI to keep the user happy (we all need users to be happy), and any rules that aren’t naturally handled in the structure of the data or in simple constraints would move out to the functional classes.
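As a sketch of what that layering might look like in T-SQL (the table, column, and procedure names here are hypothetical, invented purely for illustration):

```sql
-- Hypothetical table: the non-overridable rules live in the structure
-- itself as PRIMARY KEY, UNIQUE, and CHECK constraints.
CREATE TABLE dbo.Customer
(
    CustomerId   int IDENTITY(1,1) CONSTRAINT PKCustomer PRIMARY KEY,
    CustomerName nvarchar(100) NOT NULL
                 CONSTRAINT AKCustomer_CustomerName UNIQUE,
    CreditLimit  decimal(12,2) NOT NULL
                 CONSTRAINT CHKCustomer_CreditLimit CHECK (CreditLimit >= 0)
);
GO

-- The application layer calls procedures like this one and never
-- touches the table directly.
CREATE PROCEDURE dbo.Customer$Insert
    @CustomerName nvarchar(100),
    @CreditLimit  decimal(12,2)
AS
BEGIN
    SET NOCOUNT ON;

    INSERT INTO dbo.Customer (CustomerName, CreditLimit)
    VALUES (@CustomerName, @CreditLimit);

    -- Hand the generated key back to the caller
    SELECT SCOPE_IDENTITY() AS CustomerId;
END;
```

If the table later changes shape, only the procedure body changes; callers of dbo.Customer$Insert never know the difference.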

Many non-data programmers would prefer that all the database did was hold the data. Even things like UNIQUE constraints, which fail whenever duplicate data is inserted, are too much.

What I really find interesting is that while stored procedures have been reasonably steady for 15 years, I couldn’t keep up with the number of different languages that were bandied about in the 4 or 5 hours of non-data topics I heard over the week. Ruby, PHP, VB 6, VB.NET, C#, Ajax, Python, F#, ASP, AUGH! I mean, seriously. And these are just a few. We have T-SQL. Maybe a little VB or C# for a CLR object now and again, but basically T-SQL has been around forever and just keeps getting better over time. Databases that we created in 4.21 would pretty much work now, and the stored procedures we created could have been upgraded in minutes.

The problem is partially that it seems really hard to change the database, but that is more about the fact that databases have state, unlike code. Why do you think there is COBOL code still in use? Because the code is so hard to replace? No, because the code is strongly coupled with data, which is hard to change because it is so valuable. But even if the data store could not be changed, if a proper encapsulation layer had been created, well, we might have a COBOL data access layer, but would we still have text terminals in use? I doubt it.

So if we could just work together and fortify the database behind an encapsulation layer, replacing the code wouldn’t be such a big deal. Look, I am all about getting things done fast. But it should say a lot that as data architects we want to do MORE work. I mean, in the short run, just getting the data normalized is win enough. And putting on CHECK constraints, UNIQUE constraints, and triggers in some cases would be win enough (and very little work). But if we do the extra work up front, then later we can save our data-administrating brothers (and sisters!) from having to demand code changes to optimize a query. Which we can’t do anyway, because, based on Josh Holmes’ keynote, the average tenure of a programmer is about 18 months, and no one seems to understand how the code works anymore.

So, if the definition of a great conference is to get people thinking… well, I guarantee you it was a great conference for me.

Published Monday, August 17, 2009 9:50 PM by drsql




Craig Berntson said:

I think you missed the part where I talked about entities, not one class per table. And I never said anything about the developer designing their own tables. In fact, I advocated using a data guy. But my session was for developers, and they should be thinking about process first. Then use the data access layer and entities to link between the data and the domain layer.

August 17, 2009 11:35 PM

Jack Corbett said:

Nice post.  It's definitely hard to get developers to understand that a class and a table don't/shouldn't be a 1 to 1 relationship, especially with the explosion of ORM tools over the last few years.

August 18, 2009 8:07 AM

RichB said:

Great post.


August 18, 2009 9:39 AM

Mike Rankin said:

I was also bothered by the complexity lecture for two reasons.  

Like anyone else, I don't want to be the one holding up the deliverable.  For this, Josh's presentation brought out some guilt and sympathy.  

On the other hand, when a user asks for something seemingly simple, the "devil's always in the details".  Some systems are just complex and there's nothing that can be done about it.  For this, I felt frustration toward Josh's message.

My personal conclusion is to strive for simplicity when possible.  I've experienced situations where a collection of simple apps built over many years can become an unwieldy beast because no one bothered to consider the complex web these things can become.

So, simplicity without losing sight of the "bigger picture"?

August 18, 2009 9:50 AM

drsql said:

Craig, I am sorry if it sounded like you were lumped in with those groups. Frankly, I thought you had very good ideas and didn't go that way. Like I did say, though, I think that by the end I was agreeing with you more and more.

Jack, amen to the tools thing. Everywhere I talk, all the devs I know want to ban the data layer... I wouldn't hate it so much if they wanted to rebuild it in their own language instead of just getting rid of it completely. But in the end, when a complicated query comes up, they will end up with a mess of code trying to make their tools work, whereas I can hide those details in a proc very easily and the functional code needn't even know about it.

Mike, Albert Einstein's quote about "Everything should be made as simple as possible, but not simpler" is such a good guiding principle. But you can't stop at possible. "Not simpler" is the key phrase. When complexity matches the reality of what is being modeled/implemented, happiness ensues.

And I definitely know what you mean; it is my problem with inexperienced requirement gatherers doing Scrum/XP stuff. Look, it is all well and good to get something built, but if you miss the complexities that the user really wants, well, it is going to cost you a lot later.

I think the problem with his presentation (and a little bit with Craig's too, in the beginning) is that they forget that their target audience is salivating over the idea of doing things without the complexities of teamwork. Take out the DBA and do that yourself. Make the user sit with you and see what you are doing rather than letting someone gather requirements. Why does waterfall fail? I think two reasons:

1. Things change faster than we can code them

2. Not enough skill levels in the early portions of the process

1 is why Scrum is a good idea: iterate on smaller chunks at a time. And 2 is why Scrum isn't the answer to everything. Frankly, if more people were awesome at requirements and design, the later phases would go more smoothly. So we could have had 3-6 month project phases and gotten a lot more done...but coding is fun, and requirements gathering requires specialization (and lots of deep thinking), as does implementation design.

Some level of specialization is useful if you are going to be on a > 1 person team. I know database design and T-SQL only, and even if I am one of the best in the world at it, I might be useless on a scrum team, since that work is usually only a very small percentage of the time in comparison to the programming, because I use known patterns and practices and keep it simple. In fact, I can model and implement all of the procs needed to keep any of the app code from touching tables in far less time than even their mapping tools could come close to. Why? Because T-SQL has been around for many years, and the tools I have built still doggone freaking work version to version. Every 3 or so years I update them to cover the newest stuff (like in 2005 I implemented error trapping), but even if I didn't, most of the code would still work.
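The error trapping mentioned is presumably the structured TRY...CATCH handling that arrived in SQL Server 2005; the typical template looks something like this (the table and UPDATE are just hypothetical placeholders):

```sql
BEGIN TRY
    BEGIN TRANSACTION;

    UPDATE dbo.Customer               -- placeholder data modification
    SET    CreditLimit = CreditLimit + 100
    WHERE  CustomerId = 1;

    COMMIT TRANSACTION;
END TRY
BEGIN CATCH
    -- Undo any open transaction, then re-raise so the caller
    -- still sees the failure
    IF @@TRANCOUNT > 0
        ROLLBACK TRANSACTION;

    DECLARE @errorMessage nvarchar(2048);
    SET @errorMessage = ERROR_MESSAGE();
    RAISERROR (@errorMessage, 16, 1);
END CATCH;
```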

August 18, 2009 11:01 AM

Roberto Lopez said:

This is an interesting discussion, having come from the database side and spent most of my first few years writing mostly T-SQL code. Now I am on the other side as a web and desktop software developer. For most of my career I have designed my own databases or filled the role of database designer for another team. I have started using ORM tools and see the benefits that they provide. I think the problem arises when developers do not realize the impact of what they are writing in LINQ or nHibernate or your favorite ORM tool. Another problem is when developers try to use the ORM tool to write reports. Phil Japikse made a good point at one of our .Net User Group meetings in that an ORM tool is really best used for OLTP. I don't know if we will ever bridge the gap completely between the "data" guys and the "programmers". It's up to us to educate those around us.

August 18, 2009 12:46 PM

drsql said:

I can't disagree with you, but can you list the benefits?  And are they all benefits that are development based?  Dev time is such a small part of the lifecycle of a well built project and it seems to me that the shorter amount of time spent there is often a detriment to the future of the project.

The other thing to consider is that the lifespan of the database is far greater than the lifespan of the app. Like I said before, databases created in 7.0 or earlier are still out there running, and a lot of times the reason they are on old versions of SQL Server is that there is a brittle app attached. Ask how many of those used stored procedures...nearly all give you this look like "yeah...we don't really know what the app uses, so upgrading is impossible...oh, and the app uses =* joins, so we are stuck."
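For anyone who has not run into them, *= and =* are the old non-ANSI outer join operators, which later compatibility levels refuse to run; the fix is mechanical, but someone has to find and rewrite every query (table and column names below are hypothetical):

```sql
-- Old non-ANSI outer join; fails under later compatibility levels
SELECT c.CustomerName, o.OrderId
FROM   dbo.Customer c, dbo.[Order] o
WHERE  c.CustomerId *= o.CustomerId;

-- ANSI equivalent that upgrades cleanly from version to version
SELECT c.CustomerName, o.OrderId
FROM   dbo.Customer c
       LEFT OUTER JOIN dbo.[Order] o
           ON c.CustomerId = o.CustomerId;
```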

August 18, 2009 2:05 PM

Roberto Lopez said:

The benefits that I have seen when using an ORM tool have been:

1. Removal of time consuming "plumbing" code for the persistence layer in the application.

2. Depending on what ORM tool you use, changes in the database are easily updated in code.

Can't think of anything else right now, but my view could be skewed, as I normally design the database first and maybe most people do the reverse. There is also the problem of some developer writing something like "SELECT * FROM MillionRowTable". For that you could still use stored procedures to expose your database and use those in your ORM tool. Which brings me to my one complaint about ORM tools: stored procedure support seems to be an afterthought. They are getting better, though.

August 18, 2009 2:29 PM

Bruce W Cassidy said:

From my own experience, I've found that a lot of the issues around dealing with data do need to be isolated from the application developers.

The comment about "even a unique constraint is too much" is spot on.  And you know what?  The developers are right.  That's the sort of thing that should be handled by the database code, not by the application code.

The solution I have found is for the database code to present an API that deals with object data.  So from the application point of view, it becomes an object repository.  I used typed XML to pass the data as a set back and forward between the application and the database.  In that way, the details of how the data is stored are completely isolated from the application definition.  Instead, everything is based around an agreed interface, captured within the XML schemas.
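A minimal T-SQL sketch of that kind of XML-based interface might look like the following (the procedure, element, and column names are hypothetical; a real implementation would type the parameter against an XML SCHEMA COLLECTION, as described above):

```sql
CREATE PROCEDURE dbo.Customer$SaveSet
    @CustomerSet xml   -- the agreed interface: a set of Customer elements
AS
BEGIN
    SET NOCOUNT ON;

    -- Shred the XML set into the relational structure; the application
    -- never sees how (or where) the rows are actually stored.
    INSERT INTO dbo.Customer (CustomerName, CreditLimit)
    SELECT  c.value('(CustomerName/text())[1]', 'nvarchar(100)'),
            c.value('(CreditLimit/text())[1]',  'decimal(12,2)')
    FROM    @CustomerSet.nodes('/CustomerSet/Customer') AS T(c);
END;
```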

Another thing I found is that I need to think about where code is located based on the type of work it's doing.  For example, if I am processing a set of open invoices to close them and sum up totals and prepare them for printing, it makes sense to do that on the database as a set oriented operation.  Yet, many purists seem to think that all business logic should be in the middle tier.

Here, I ended up thinking about the different domains of the application.  Just as you wouldn't put front end verification code within the middle tier, nor would you put data set oriented code there.  It's not about containing all of the business logic in one place, because let's face it: none of those layers are useful in isolation.  So face the fact that we need the front end, the business/process logic "middle tier" and the database "data set" logic, and plan for it.

So I end up with all database code being encapsulated behind an interface, and data that is passed to the application is in the form of XML objects.  The complexities of the relational model are hidden from the middle tier application developer, just as the complexities of the object model are hidden from the database.  They communicate through a well defined interface.

It's all pretty much common sense so far as I'm concerned.  So why are we still struggling with this stuff?

August 23, 2009 8:32 PM

drsql said:

Bruce, the reason is that what you just said can easily be translated to "work, work, work, this is so not worth it, work, work" by the "wrong" ears.  I completely agree, except when you say:  

"The developers are right.  That's the sort of thing that should be handled by the database code, not by the application code."

I think you meant the developers are wrong :)

The problem I have seen is that devs don't want to take the time to do plumbing. They all want to be working in the "glamorous" technologies with catchy names. But we database folks are the epitome of the non-glamorous lifestyle. T-SQL is an ugly language. SQL itself is beautiful once you "get it", but really doesn't meld well with the functional practices that make C# work.

Of course, if you could bottle up your API creator and sell it to Microsoft (or someone who will keep it running for years) then I like the way it sounds.  

August 23, 2009 10:46 PM

Grant Fritchey said:

Excellent discussion. It's nice to see that others are struggling with these issues. Many of the local developers I support are getting on the "it's not a database, it's an information persistence layer" bandwagon. Their efforts are absolutely focused on speed of initial development, with little to no thought towards maintainability or the life of the "information" "persisted" in the "layer." The problem, as I see it, is that they're still storing data in a database. You can label it something else to make you feel better, but it doesn't eliminate the hard work of dealing with how the relational storage engine works. Unfortunately, that's what they're trying to do, eliminate that messy, difficult thing from the equation by changing the language, not the implementation. There's no evidence it's going to work well and plenty of evidence that they're digging a hole.

August 24, 2009 9:42 AM
