THE SQL Server Blog Spot on the Web


Linchi Shea

Checking out SQL Server via empirical data points

Would you optimize SQL for less performance?

What do you mean? Okay, that does sound like an oxymoron, doesn't it?

Let's say you are trying to optimize a stored procedure, and your proposed change results in the stored procedure running not faster, but perhaps a bit slower than it currently does. Can you then declare your optimization effort a success?

Just some food (well, a question) for thought.

Published Friday, January 08, 2010 9:56 AM by Linchi Shea



AaronBertrand said:


How exactly are you measuring your optimization?  Are you saying that you re-wrote the stored procedure in such a way that it *should* be faster (e.g. getting rid of a cursor, eliminating joins) or for less tangible reasons (e.g. making the code tidier, getting rid of read uncommitted)?

January 8, 2010 9:08 AM

Linchi Shea said:

Hi Aaron;

It's totally open-ended. But if it's just to make the code more readable, tidier, etc., that is not really SQL optimization, is it?

This is really just a question thrown out there to provoke people to think a bit about what optimization may mean. Obviously, I have some opinions on this that I'll put together in a post later.

January 8, 2010 9:21 AM

Stephen Colbert said:

Hey man,

I'd say that what you're describing is refactoring, and IMO refactoring is a fine thing to aspire to, but VASTLY lower in importance than performance.  I'd make any refactoring initiative require the same or better performance metrics as a condition of acceptance.


January 8, 2010 9:30 AM

Alexander Kuznetsov said:


It depends on the requirements. If I need to ensure a 0.5 sec response time, and my query runs in 0.01 sec most of the time but has a risk of running in 2 sec once a week, then my query does not meet the requirements. If I change it and it always runs in 0.4 sec with no chance of spikes, then it meets the requirements, although to optimize it I had to essentially circumvent the optimizer.
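A minimal sketch of the kind of "circumvent the optimizer" change being described, using hypothetical table and index names: a hint pins the plan so it can never flip to the occasionally disastrous one, at the cost of a slightly slower average run.

```sql
-- Hypothetical example: force a known-good index and a fixed compile-time
-- parameter value so the plan cannot flip under parameter sniffing.
-- Slightly slower than the best case, but no once-a-week 2-second spikes.
SELECT OrderID, OrderDate, TotalDue
FROM dbo.Orders WITH (INDEX (IX_Orders_OrderDate))
WHERE OrderDate >= @StartDate
OPTION (OPTIMIZE FOR (@StartDate = '2010-01-01'));
```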

January 8, 2010 9:38 AM

Adam Machanic said:

How about: Your server has disk I/O issues and you "optimize" the query to use less disk I/O, but the result is a bit slower in duration on an average run. Or the same for CPU? IMO, those are totally valid "optimizations", and on a few (rare) occasions I've had to do them.

January 8, 2010 9:43 AM

Denis Gobo said:

Yes, definitely! Maybe it is a proc that runs once a day or after hours. Or maybe it is a proc that runs on the last business day of the quarter (when a lot of other jobs also tend to start running to calculate QTD performance).

You made it run a little slower, but it no longer pegs the CPU and slows everybody else and all the other processes down.

January 8, 2010 9:53 AM

noeldr said:

My shop is a highly concurrent environment. I have on several occasions tuned queries for less CPU usage and slower throughput, so that the many processes running in parallel could get more CPU time. The same goes when MAXDOP is not constrained under some highly concurrent loads ... restricting queries with MAXDOP has saved the game at times.
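The MAXDOP restriction described above can be applied per query with a hint; a sketch with a hypothetical reporting query:

```sql
-- Hypothetical example: cap a heavy aggregate query at one scheduler so
-- concurrent OLTP work keeps its CPU time. This query itself runs longer,
-- but overall system throughput improves.
SELECT CustomerID, SUM(TotalDue) AS TotalSpend
FROM dbo.Orders
GROUP BY CustomerID
OPTION (MAXDOP 1);
```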

January 8, 2010 9:55 AM

dan said:


Obvious examples are where locking can occur with multiple calls, or where performance will suffer in the future with more data.

I.e., improve future concurrency and scalability at the cost of performance **under current conditions**.

January 8, 2010 10:01 AM

Jonathan Kehayias said:

Yeah, I am with Adam. In my current job I have more than once tuned a query for I/O reduction at the cost of execution time, because reducing I/O was more important than gaining a few seconds of duration.

January 8, 2010 10:12 AM

Gary said:

What is optimization?  If optimization is purely elapsed time at the client, then no, you are not done.  If optimization is system resources, then perhaps you are done if it uses fewer resources.

Why are you optimizing?  Just because it is good practice?  To solve a specific problem?  Because the boss said to do it?

Questions always come back to requirements.  Your question cannot be answered without discussing requirements.

January 8, 2010 10:15 AM

Glenn Berry said:

It depends on what you are trying to accomplish and what the biggest issues are for your system. If your biggest issue is I/O, you might do something that makes a query slower in elapsed time, but reduces the I/O cost.
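One way to see the tradeoff Glenn describes is to compare logical reads and elapsed time for the same query before and after a change. A sketch against a hypothetical table:

```sql
-- Report both I/O (logical/physical reads) and elapsed/CPU time for the
-- query, so a change that raises duration but lowers reads is visible.
SET STATISTICS IO ON;
SET STATISTICS TIME ON;

SELECT COUNT(*)
FROM dbo.Orders            -- hypothetical table
WHERE OrderDate >= '2010-01-01';

SET STATISTICS IO OFF;
SET STATISTICS TIME OFF;
```

The messages tab then shows reads alongside CPU and elapsed milliseconds, which is enough to judge an I/O-for-duration trade.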

January 8, 2010 11:26 AM

Quentin said:

Your post is rather poorly phrased.

Your subject asks us "would you optimize SQL for less performance", which implies that less performance was the *goal* of the optimization.  So, at the outset, you are asking if anyone ever *tries* to optimize for *less* performance.  This is a contradiction in terms, since almost no one uses the term "optimize" to mean "perform worse".  

In the content of your post, you then say that we are "trying to optimize a stored procedure", but that it ends up running slower after the "optimization".  Since "optimize" is usually used to indicate some effort at improvement, it can only be assumed that the goal of the optimization was NOT speed.  However, most people think of query optimization to imply a speed improvement, which makes the question somewhat confusing.

It seems a better way to pose your question would be this: "under what circumstances would you be willing to trade query speed for X" where "X" might be less memory usage, lower CPU utilization, etc.


January 8, 2010 11:36 AM

AaronBertrand said:

Linchi, I didn't mean tidier as in "readability" e.g. indenting, upper casing keywords, etc.  I meant tidier as in switching from a #temp table to a CTE, or switching from UPDATE / INSERT to MERGE, etc.  Not necessarily things you will know up front will make the code faster, but rather easier to maintain, *potentially* easier for the optimizer to work with, etc.

But I think several people make good points... it's very tough to really consider your question without knowing something about the actual requirements.  If you're just reviewing and modifying a stored procedure for fun, that is very different than fixing an explicit problem.

January 8, 2010 1:11 PM

Linchi Shea said:

The question is not meant to be tricky or vague. Nor is it meant to be something of a precise nature with any preset answers in mind. It's just something to think about, including the issues of requirements, purpose, and so on.

January 8, 2010 1:25 PM

mjswart said:

My 2 cents.

I think everyone has picked up on the (intentional) vagueness of the question. And the implied question "Does faster always mean more optimal?" is the fascinating bit for me.

In my case, almost always. My criterion of "good enough" is very closely related to user experience, which means speed.

Taking a step back: I know it may sound obvious, but I've found it extremely helpful to define the problem or the goals before undertaking anything. It's the first step in almost every formal strategy I can think of (Six Sigma, PPDAC, How to Solve It).

January 8, 2010 4:59 PM

Joe said:

I've actually "tuned" a proc to run slower because it was running too fast!  The proc was a polling service that ran constantly; it ran so often that CPU usage jumped by 20%.  I had to slow it down so that it didn't put such a load on the server while still meeting the polling interval requirements.
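A minimal sketch of that kind of fix, with a hypothetical polling proc: a WAITFOR DELAY keeps the loop within its polling-interval requirement instead of spinning the CPU.

```sql
-- Hypothetical polling loop: deliberately "slower", but far cheaper.
WHILE 1 = 1
BEGIN
    EXEC dbo.CheckForNewWork;   -- hypothetical polling proc
    WAITFOR DELAY '00:00:05';   -- poll every 5 seconds instead of constantly
END
```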

January 9, 2010 7:25 PM

Mike C said:

Another aspect that's often overlooked is a tradeoff of slower/less performance in one component (e.g., a SQL proc) for a bigger performance gain overall in a complete process.  I've moved calculations from SSIS data flows to stored procs, and vice versa, to get better performance from my entire ETL process.

January 10, 2010 5:16 PM

mbourgon said:

I've done that before.  Queries that are going to be long running (because they query long date ranges) get, at the least, MAXDOP 1.  The users querying long ranges understand that it's not an immediate process, and it frees up I/O for people running small queries who need fast results.  It's all in expectations.

March 29, 2010 9:48 AM

anita said:

Hi Linchi Shea,

I have a question about a SQL bottleneck, and I would be glad if you could show me a suitable solution.

I am working for a consulting company that has three SQL Servers, and they asked me to solve this problem:

At the end of every month, the speed of loading forms and MIS reports decreases because of the salary calculations and counting that use SQL Server. I want to find out what exactly is causing it. Which tool is better (e.g., SQLSim, SQL Profiler, ...)?

I started this job a few months ago, so I don't have much experience. If you could show me the beginning of the way, I would be pleased.

Thanks a lot


June 28, 2010 4:47 AM
