
SQLBI - Marco Russo

SQLBI is a blog dedicated to building Business Intelligence solutions with SQL Server.
You can follow me on Twitter: @marcorus

The many-to-many revolution: paper finally released!

After a month of editing, I finally released my paper titled "The many-to-many revolution".

I have copied into this post the introduction of the 84-page paper, which is published (for free!) on a dedicated dimensional modeling page of the SQLBI.EU web site.

I would like to receive feedback about the paper. You can use email, the comments, or the support forum on SQLBI.EU.

The many-to-many revolution - Introduction

Analysis Services 2005 (SSAS 2005) introduced the capability to handle many-to-many relationships between dimensions. At first glance, you may tend to underestimate the importance of this feature: after all, Analysis Services 2000 and many other OLAP engines do not offer many-to-many relationships. Yet, this lack did not limit their adoption and, apparently, only a few businesses really require it. However, as this paper shows, the UDM (Unified Dimensional Model) can leverage many-to-many relationships to present data from perspectives that are not feasible with a traditional star schema. This opens a brand new world of opportunities that transcends the limits of traditional OLAP.

We will explore many different uses of many-to-many relationships that give us more choices to effectively model business needs, including:

  • Classical many-to-many
  • Cascading many-to-many
  • Survey
  • Distinct Count
  • Multiple Groups
  • Cross-Time
  • Transition Matrix
  • Multiple Hierarchies
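
To give a taste of the classic pattern before diving into the paper, here is a minimal sketch in modern DAX terms (the paper itself targets SSAS 2005 and MDX, and all table and column names below are hypothetical, not taken from the paper): a bridge table passed as a filter argument to CALCULATE propagates a Customer selection to Account, resolving the many-to-many relationship.

```dax
-- Hypothetical banking model: Balance[Amount] is related to Account,
-- and the AccountCustomer bridge links Account to Customer.
-- Passing the bridge table to CALCULATE propagates the Customer
-- filter to Account through the bridge rows.
CustomerBalance :=
CALCULATE (
    SUM ( Balance[Amount] ),
    AccountCustomer
)
```

The same idea underlies all the variations listed above: what changes is how the bridge is built and filtered.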


Although you do not have to do so, I recommend reading the models in the order presented above, because each one often builds upon the previous models.

Each model has a brief introduction, followed by a business scenario that may benefit from its use and an explanation of its implementation. Each model uses only the minimal set of dimensions necessary to explain the concept behind it, and a small dataset that demonstrates the underlying behavior.

Only the Distinct Count scenario contains a section discussing the impact on performance. Since the considerations presented there may be applied to other uses of many-to-many relationships, I recommend reading that scenario if you are interested in performance evaluation.

An important warning applies if you are going to use the VisualTotals MDX function (directly or through an OLAP browser): with many-to-many dimensions, visual totals apply to only one level at a time. In the Links section, you will find a link to a document written by Richard Tkachuk that explains this limitation.

Download the complete paper and demo here.

Published Sunday, September 24, 2006 6:56 PM by Marco Russo (SQLBI)




ivan said:

Great paper! Thanks very much. There is one thing that I was hoping you could comment on.

In the banking example, there is a simple bridge without start/end date columns. This would suggest that the bridge contains only the most recent view of the relationships on the product. Am I understanding this correctly?

I assume that adding start and end dates to the bridge table would make it impossible to use in Excel, since the bridge would have to be filtered based on the date the user wants to consider.

January 23, 2014 12:44 PM

Marco Russo (SQLBI) said:


You're correct. You can implement the view at different dates by creating a snapshot for each date (e.g. one per month) and slicing data by using one of these dates in the bridge table (you might want to use a separate date table for that, with a default member set to the last date available).
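
A hedged sketch of this snapshot idea in DAX (the reply above does not include code, and all names here are hypothetical): the bridge repeats its rows once per snapshot, tagged by a snapshot date column, and the measure defaults to the latest snapshot visible in the filter context.

```dax
-- Hypothetical snapshot bridge: AccountCustomer repeats its rows once
-- per monthly snapshot, tagged by SnapshotDateKey. SnapshotDate is a
-- separate, disconnected date table used only to pick a snapshot.
CustomerBalanceSnapshot :=
VAR PickedSnapshot =
    MAX ( SnapshotDate[DateKey] )   -- latest snapshot in the current selection
RETURN
    CALCULATE (
        SUM ( Balance[Amount] ),
        FILTER (
            AccountCustomer,
            AccountCustomer[SnapshotDateKey] = PickedSnapshot
        )
    )
```

With no user selection on SnapshotDate, MAX returns the last available date, which plays the role of the default member mentioned above.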


January 23, 2014 12:49 PM

Ivan said:

Thanks Marco.

January 23, 2014 1:10 PM

LT said:

When combined with distinctcount and userelationship, M2M relationship takes very long time to run even though RAM and CPU usage remains low. When looking at the execution plan, the same Vertipaq query keeps repeating again and again with no end with slight difference in the "where" clause.
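
The combination described here can be sketched as follows (a hypothetical reconstruction with made-up names, not the commenter's actual measure): a distinct count, a bridge table as a filter argument, and an inactive relationship activated with USERELATIONSHIP.

```dax
-- Hypothetical measure combining the three ingredients mentioned:
-- a distinct count, a many-to-many bridge used as a filter, and an
-- inactive relationship activated with USERELATIONSHIP.
CustomersByShipDate :=
CALCULATE (
    DISTINCTCOUNT ( Sales[CustomerKey] ),
    AccountCustomer,                                         -- bridge table
    USERELATIONSHIP ( Sales[ShipDateKey], 'Date'[DateKey] )  -- inactive relationship
)
```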

June 13, 2014 11:42 AM

Marco Russo (SQLBI) said:

LT - it depends: there are many possible optimizations, based on how you write the DAX query and how you implement the data model. Moreover, there are many improvements in the latest builds of Analysis Services, so also consider installing the latest cumulative updates and service packs.

From what you describe, the formula is being executed in the formula engine, which might be caused by an unoptimized DAX expression.

June 14, 2014 6:02 AM

LT said:

Marco, maybe you are right. I need to look again at my DAX expression. I have 3-4 hierarchy tree tables, which are all connected to the fact table through bridges. The hierarchy depth is not fixed and hence the trees cannot be flattened into regular hierarchies (at least for the time being).

Based on the cross-filter (i.e. whatever tree table is filtered), the DAX expression passes these bridge tables inside the CALCULATE function. This works fine for active relationships. The problem starts when I need to insert the USERELATIONSHIP function for other measures that are calculated on inactive relationships, along with all the other filters mentioned earlier.

June 18, 2014 6:47 AM

Marco Russo (SQLBI) said:

If you create a model that is too complex, you might run into a number of issues, most of which can be solved by simplifying the data model.

Moreover, consider that a completely different approach might be based on using more DAX and fewer relationships (and probably fewer tables).
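
One hedged sketch of the "more DAX, fewer relationships" idea (all names hypothetical; this is not one of the two patterns referenced below): replace the physical relationship between the fact table and the bridge with a filter written in DAX, using the CONTAINS pattern available in Analysis Services builds of that era.

```dax
-- Hypothetical virtual relationship: Balance has no physical
-- relationship to the AccountCustomer bridge. The measure keeps only
-- the accounts that appear in the bridge rows visible in the current
-- filter context.
BalanceVirtual :=
CALCULATE (
    SUM ( Balance[Amount] ),
    FILTER (
        ALL ( Balance[AccountKey] ),
        CONTAINS (
            AccountCustomer,
            AccountCustomer[AccountKey], Balance[AccountKey]
        )
    )
)
```

The trade-off is that the filter is evaluated by the formula engine, so this approach is best when it lets you remove tables and relationships that were themselves the bottleneck.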

Take a look at these two patterns:

June 18, 2014 7:06 AM



About Marco Russo (SQLBI)

Marco Russo is a consultant, writer and trainer specialized in Business Intelligence with Microsoft technologies. He runs the SQLBI.COM website, which is dedicated to distributing resources useful to BI developers, such as Integration Services components, Analysis Services models, tools and technical information. Marco is certified as MCT, MCDBA, MCSD.NET, MCSA, MCSE+I.
