THE SQL Server Blog Spot on the Web


Davide Mauri

A place for my thoughts and experiences with the Microsoft Data Platform

Viewing how much memory is used by non-reused query plans

One observation that may help convince developers that they should parameterize queries, and that they must check that the ORM they use does so correctly, is to show how much memory can be wasted by plans that cannot be effectively reused.
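As a quick sketch of the difference (the table name and values here are purely hypothetical), an ad-hoc statement with inlined literals compiles and caches a separate plan for each distinct value, while the same statement parameterized through sp_executesql compiles one plan that is reused for every value:

```sql
-- Ad-hoc: each distinct literal produces its own cached plan (usecounts = 1)
select * from dbo.Orders where OrderID = 42;
select * from dbo.Orders where OrderID = 43;

-- Parameterized: one plan, reused for every value of @id
exec sp_executesql
    N'select * from dbo.Orders where OrderID = @id',
    N'@id int',
    @id = 42;
```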

The following query can help with this:

with cte as (
    select
        reused = case when usecounts > 1 then 'reused_plan_mb' else 'not_reused_plan_mb' end,
        cacheobjtype, objtype, size_in_bytes
    from sys.dm_exec_cached_plans
), cte2 as (
    select reused, cacheobjtype, objtype,
        size_in_mb = sum(size_in_bytes / 1024. / 1024.)
    from cte
    group by
        reused, cacheobjtype, objtype
), cte3 as (
    select * from
        cte2 c
    pivot
        ( sum(size_in_mb) for reused in ([reused_plan_mb], [not_reused_plan_mb])) p
)
select
    objtype, cacheobjtype, [reused_plan_mb] = sum([reused_plan_mb]), [not_reused_plan_mb] = sum([not_reused_plan_mb])
from cte3
group by
    objtype, cacheobjtype
with rollup
having
    (objtype is null and cacheobjtype is null) or (objtype is not null and cacheobjtype is not null)

The result is something like this:


As you can see, this server (SQL Server 2005 SP2) is using nearly 1.8 GB of memory for plan caching, and one third of that is memory holding plans that are never reused (usecounts = 1, as reported by the sys.dm_exec_cached_plans DMV).
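If you then want to see which statements those single-use plans belong to, a query along these lines, cross-applying sys.dm_exec_sql_text to get the statement text, can help:

```sql
-- Largest single-use cached plans together with their statement text
select top 50
    cp.usecounts,
    cp.size_in_bytes / 1024. as size_in_kb,
    st.text
from sys.dm_exec_cached_plans cp
cross apply sys.dm_exec_sql_text(cp.plan_handle) st
where cp.usecounts = 1
order by cp.size_in_bytes desc;
```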

Monitoring the result of this query can also show how much impact the new (SQL Server 2008) “optimize for ad hoc workloads” option can have on a system.
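For reference, the option is a server-level setting enabled via sp_configure; since it is an advanced option, “show advanced options” has to be turned on first:

```sql
exec sp_configure 'show advanced options', 1;
reconfigure;
exec sp_configure 'optimize for ad hoc workloads', 1;
reconfigure;
```

With the option on, the first execution of an ad-hoc statement caches only a small compiled-plan stub; the full plan is cached only if the statement is executed again.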


I’ve also updated my sys2dmv project on CodePlex with this new script.

Published Friday, July 23, 2010 11:48 AM by Davide Mauri




David Dubois said:

Thanks Davide, ... Is that gigabytes or megabytes?

July 24, 2010 6:09 AM

Uri Dimant said:

July 25, 2010 2:56 AM


Davide Mauri said:

@David Dubois



Thanks Uri, I wasn't aware of Kimberly's article. Thanks!

July 25, 2010 5:14 AM

AaronBertrand said:

On my system, where we use a lot of ad hoc SQL due to requirements, I've switched to optimize for ad hoc workloads and forced parameterization.  I have 4.2 GB reused and 395 MB not reused; not as bad as I expected.  I imagine it was swinging the other way a year ago.
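Unlike the server-level ad hoc workloads option, forced parameterization is set per database (the database name below is a placeholder):

```sql
-- Make SQL Server parameterize literals in ad-hoc statements automatically
alter database MyDatabase set parameterization forced;
```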

July 26, 2010 9:57 AM

Davide Mauri said:


Interesting observations! 4.2 GB of reused plans sounds pretty good! How much memory does your server have?

July 26, 2010 12:18 PM

AaronBertrand said:

This server is x64, 16 cores with 32 GB RAM.  Memory is pegged but CPU hovers around 10%.

July 26, 2010 1:06 PM

TheSQLGuru said:

I have a client with REALLY bad old ADO looping code, and I have a job set up that runs every 30 minutes to flush the plan cache to avoid serious bloating (they are still on an early version of SQL 2005, so the plan cache can take up a very large fraction of RAM).
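A scheduled job like that typically just runs the statement below; it is a drastic measure, since every plan on the instance has to be recompiled afterwards (on 2008+ the option discussed in the post is a gentler alternative):

```sql
-- Flush the entire plan cache for the instance
dbcc freeproccache;
```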

August 2, 2010 11:55 AM

Davide Mauri said:

!?!?!?!?!? Arght!!!!!!!! They should switch to SQL 2008 immediately (or try to fix the application, but I guess migrating to 2008 is less expensive :))

August 2, 2010 12:11 PM


About Davide Mauri

Director of Software Development & Cloud Infrastructure @ Sensoria, an innovative smart garments and wearables company. After more than 15 years working with the Microsoft Data Platform, with a specific focus on high-performance databases, Business Intelligence, Data Science, and data architectures, he's now applying all his skills to IoT, defining architectures to crunch numbers, create nice user experiences, and provide meaningful insights, all leveraging the Microsoft Azure cloud. A Data Platform MVP since 2006, he has a very strong development background and loves both the ER model and OO principles. He is also a fan of Agile methodology and automation, which he tries to apply everywhere he can, to make sure that "people think, machines do".
