THE SQL Server Blog Spot on the Web


SQLBI - Marco Russo

SQLBI is a blog dedicated to building Business Intelligence solutions with SQL Server.
You can follow me on Twitter: @marcorus

Memory configuration (no pitfalls!) in SSAS

I recently had some problems with the memory configuration of SQL Server Analysis Services 2005 and I'd like to share the lessons I learned. This post replaces my previous one, which I struck through to keep a record of some bad assumptions I initially made.

SSAS 2005 has several settings that control how memory is managed. The default values don't account for concurrent applications, so you may want to change them on a server that runs other services. The settings I'm talking about are defined per instance of SSAS: please note that Microsoft suggests not touching these settings...

Memory \ LowMemoryLimit: it defines the amount of physical memory at which the server considers itself low on memory. Values greater than 100 are absolute values in bytes (scientific notation is supported in the config file, msmdsrv.ini). Values of 100 or less are interpreted as a percentage of memory: the reference value (corresponding to 100) is the minimum of the physical memory and the virtual address space available to the process. A few cases (for simplicity, I show the memory limit for a hypothetical parameter value of 100): an 8 GB RAM 32-bit Windows machine has a 2 GB limit (or 3 GB if you have the /3GB option enabled); an 8 GB RAM 64-bit Windows machine has an 8 GB limit; a 1 GB RAM 32-bit Windows machine has a 1 GB limit.

The default is 75 (75%), which is good most of the time, but if you have other processes on the same machine you may prefer to lower this setting. If you want to know the actual absolute limit at runtime, you can read the exact running value from the MSAS 2005:Memory\Memory Limit High KB performance counter.
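To make the rule above concrete, here is a small sketch (my own illustration, not SSAS code) of how the effective limit follows from the parameter value, assuming the reference memory is the minimum of physical RAM and the process's virtual address space as described above:

```python
def effective_memory_limit(setting, physical_bytes, process_virtual_bytes):
    """Absolute limit implied by a LowMemoryLimit / TotalMemoryLimit value.

    Values > 100 are absolute bytes; values <= 100 are a percentage of
    min(physical memory, virtual address space available to the process).
    """
    if setting > 100:
        return setting  # already an absolute value in bytes
    reference = min(physical_bytes, process_virtual_bytes)  # the "100" point
    return reference * setting / 100

GB = 1024 ** 3

# 8 GB RAM, 32-bit Windows: 2 GB user address space -> 2 GB reference
print(effective_memory_limit(100, 8 * GB, 2 * GB) / GB)   # 2.0
# 8 GB RAM, 64-bit Windows: address space exceeds RAM -> 8 GB reference
print(effective_memory_limit(100, 8 * GB, 16 * GB) / GB)  # 8.0
# The default of 75 on that 64-bit machine would give 6 GB
print(effective_memory_limit(75, 8 * GB, 16 * GB) / GB)   # 6.0
```

The perfmon counter mentioned above reports this computed value, so it is the easiest way to check the result of any change.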

Memory \ TotalMemoryLimit: it's the maximum percentage of memory that the server will use. Behavior and tuning considerations are analogous to LowMemoryLimit.
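For reference, both settings live in the Memory section of msmdsrv.ini. The fragment below is a sketch from memory of the file's layout (the element names match the parameter names; 65 and 80 are example values, not a recommendation):

```xml
<ConfigurationSettings>
  <Memory>
    <!-- point at which the server considers itself low on memory -->
    <LowMemoryLimit>65</LowMemoryLimit>
    <!-- maximum the server will try to use -->
    <TotalMemoryLimit>80</TotalMemoryLimit>
  </Memory>
</ConfigurationSettings>
```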

I have noticed that in some cases PSS suggested lowering these settings to avoid an Out Of Memory error in SSAS 2005. After my previous wrong assumptions, Eric J of Microsoft helped me understand what BOL still doesn't explain well. Eric also wrote some other hints that we can share and index for future reference.

Eric: You should assume AS2000 and SSAS2005 are completely different products.  There are some similarities in settings but many differences.

Eric: An out of memory error could have been caused by many things, but basically memory could not be allocated from the OS.  Changing config values also affects the point at which the cleaner thread removes "shrinkable" memory.  Watch the perfmon counter "MSAS 2005:Memory\Cleaner Memory Shrinkable KB", and its nonshrinkable counterpart.

Eric: To observe effects while you change LowMemoryLimit settings you can:

1. Open msmdsrv.ini in Notepad, change the value, and save the file.
2. Open perfmon and add the "MSAS 2005:Memory\Memory Limit High KB" counter (report view works well).
3. Observe how the counter value changes.

I already suggested to Eric that BOL could be improved on this topic (the existing information is not very clear).

Published Friday, February 17, 2006 7:58 PM by Marco Russo (SQLBI)





Keith Mescha said:

Marco, good clarification on these settings. They are very confusing and I wasn't finding much clarification elsewhere.

On a separate note, I'm trying to find out whether defragging the drive where my cubes' data is stored makes sense. Any experience with this?

November 18, 2009 11:30 AM

Marco Russo (SQLBI) said:

As usual, defragging is good, but I have never measured how it impacts performance on an SSAS machine. Consider that the amount of RAM used as cache strongly impacts performance. Of course, writing to a fragmented drive is slower than to a non-fragmented one, so processing times are affected for sure. Query times depend on other factors.

In any case, schedule regular defragmentation.

November 18, 2009 8:23 PM



About Marco Russo (SQLBI)

Marco Russo is a consultant, writer and trainer specializing in Business Intelligence with Microsoft technologies. He runs the SQLBI.COM website, which is dedicated to distributing resources useful to BI developers, such as Integration Services components, Analysis Services models, tools and technical information. Marco is certified as MCT, MCDBA, MCSD.NET, MCSA, MCSE+I.
