
SQLBI - Marco Russo

SQLBI is a blog dedicated to building Business Intelligence solutions with SQL Server.
You can follow me on Twitter: @marcorus

  • Power BI Desktop & Excel

    The August 2015 update of Power BI Desktop added two important features for existing Excel and Analysis Services users:

    In case you haven't tried it before, Power BI Desktop can connect to Analysis Services Tabular (the connection for Multidimensional will arrive later, but Microsoft is working on it). It is interesting to note that Power BI Desktop sends different DAX queries to Analysis Services 2012/2014 and to Analysis Services 2016. The latter has better performance, thanks to the many new DAX functions and other improvements in the query engine. Thus, especially for complex reports, consider testing with the latest available CTP of SQL Server 2016 (at the moment of writing, CTP2.3 is the latest available, but new versions might be released every month).

    The other important news is that you can import into Power BI Desktop an existing data model created in Power Pivot for Excel. In reality, you also import Power Query scripts and Power View reports. I found some minor issues when I imported linked tables, but overall the experience was very good. After you import the data model, you can refresh it within Power BI. If you used Excel linked tables, you get a Power Query script that reads the same data from the original Excel files when you refresh the data model.

    The opposite is not possible: you cannot import into Power Pivot for Excel a data model created in Power BI Desktop. Since a real pivot table is not present in Power BI today, it would be very useful to be able to connect an Excel pivot table to an existing Power BI data model. If you would like this feature integrated and supported, please vote for the Ability to connect Excel to Power BI Data Model and create Pivot/Charts suggestion on the Power BI support website.

    Now, as described in the proposal, there are two ways to obtain this feature:

    • Connect the pivot table to the model hosted on powerbi.com: this would be similar to the connection to a model hosted in SharePoint. I guess that the only existing barrier to implementing this feature is authentication; in fact, such a feature is not available in SharePoint Online either. Of course, such a feature would be more than welcome.
    • Connect the pivot table to a local PBIX file: this is a completely different story, and it would be part of the scenarios I described in the Power BI Designer API feature request a few months ago (with around 1,400 votes it is the fifth most requested feature). In this case, the implementation might be realized in two ways: by integrating the Power BI engine within Excel, or by connecting Excel to Power BI Desktop. The former is unlikely to happen, because Power Pivot for Excel is already the engine we are talking about, and I think the release cycles of the two products will always be too different to enable this scenario. The latter is simpler, and it is actually already possible, although completely unsupported. It would be nice if Microsoft simply enabled support for it.

    At this point you might be curious about how to connect an Excel pivot table to Power BI Desktop. Well, let me start with an important note.

    DISCLAIMER: the following technique is completely unsupported. You should not rely on it for production use, and you should not provide it to end users who might rely on it for their job. Use it at your own risk, and don't blame either me or Microsoft if something does not work as expected. I suggest using this only to quickly test measures and models created in Power BI with a pivot table.

    Well, now if you want to experiment, this is the procedure:

    1. Open Power BI Desktop and load the data model you want to use
    2. Open DAX Studio and connect to the Power BI Desktop model
    3. In the lower right corner of DAX Studio, you will find a string such as "localhost:99999", where 99999 is a port number that changes every time you open a model in Power BI Desktop (even for the same model). Remember this number
    4. Open Excel (2007, 2010, 2013 - you can use any version) and connect to Analysis Services (in Excel 2013, go to Data / Get External Data / From Other Sources / From Analysis Services), specifying the previous string "localhost:99999" as the server name (using the right number instead of 99999) and using Windows Authentication
    5. At this point you will see a database with a strange name and a cube named Model. Click Finish and enjoy your data using a pivot table, a pivot chart, or a Power View report (why you should use the latter in this scenario, I don't know...)
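    If everything works, a quick way to double-check that you are connected to the right model is to run a trivial DAX query from DAX Studio against the same localhost instance. A minimal sketch, where the Sales table name is only an assumption (use any table of your model):

    EVALUATE
    ROW ( "Rows in Sales", COUNTROWS ( Sales ) )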

    Let me save you some time by describing the problems you will have using this approach:

    • If you close the Power BI Desktop window, the connection will be lost and the pivot table will no longer respond to user input.
    • If you save an Excel file created with this connection, the next time you open it the connection will have to be updated with the right server name and the correct port number (if you try to refresh the pivot table, you get an error, and you can change the server name in the dialog box that appears).
    • This feature might be turned off by Microsoft at any moment (in any future update of Power BI Desktop).

    That said, I use this technique to test the correctness of measures in a Power BI Desktop data model, because the pivot table is faster than the other available UI elements for navigating data and examining a large number of values. But I never thought for a second of providing such a way to navigate data to an end user. I would like this to be supported by Microsoft before doing so. Thus, if you think the same, vote for Microsoft to support it.

  • Large Dimensions in SSAS Tabular #ssas #vertipaq

    After many years of helping several companies around the world create small and large data models using SQL Server Analysis Services Tabular, I’ve seen a common performance issue that is underestimated at design time. The VertiPaq engine in SSAS Tabular is amazingly fast: you can have billions of rows in a table and query performance is incredible. However, in certain conditions, queries over tables with just a few million rows are very slow. Why is that?

    Sometimes the problem is caused by DAX expressions that can be optimized. But if the problem is in the storage engine (something that you can measure easily with DAX Studio), then you might be in bigger trouble. However, if you are not materializing too much (and this is a topic for another post, or for the Optimizing DAX course), chances are that you are paying the price of expensive relationships in your data model.

    The rule of thumb is very simple: a relationship using a column that has more than 10 million unique values will likely be slow (hopefully this will improve in future versions of Analysis Services - this information is correct for SSAS 2012/2014). You might observe slower performance already at 1 million unique values in the column defining the relationship. As a consequence, if you have a star schema and a large dimension, you have to consider some specific optimizations (watch my session at Microsoft Ignite to get some hints about that).
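    A quick way to check whether a model falls in this danger zone is to measure the cardinality of the column defining the relationship with a trivial DAX query. A minimal sketch, where Sales[CustomerKey] is only a placeholder for your own relationship column:

    EVALUATE
    ROW ( "Relationship cardinality", DISTINCTCOUNT ( Sales[CustomerKey] ) )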

    If you want to know more, read my article on SQLBI about the Costs of Relationships in DAX, with a more complete discussion of the problem and a few measurements of the timings involved.

  • DAX Formatter now supports Power BI Desktop and Excel 2016 #dax #powerbi

    If you use DAX, you should try DAX Formatter. Now it supports all the new functions introduced in Power BI Desktop and in Excel 2016.

    There are more than 70 new functions, even if half of them correspond to Excel functions with the same name (see the second group). DAX Formatter also supports the variable syntax available in the new DAX.
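    Just to give an idea of the new functions, CONCATENATEX is one of the new iterators: it evaluates an expression for each row of a table and concatenates the results. A minimal sketch, assuming a hypothetical Product table with a Product Name column:

    Products Sold :=
    CONCATENATEX (
        VALUES ( 'Product'[Product Name] ),
        'Product'[Product Name],
        ", "
    )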

    These are the new “original” DAX functions:

    • ADDMISSINGITEMS
    • CALENDAR
    • CALENDARAUTO
    • CONCATENATEX
    • CROSSFILTER
    • CURRENTGROUP
    • DATEDIFF
    • EXACT
    • EXCEPT
    • GEOMEAN
    • GEOMEANX
    • GETIMAGE
    • GROUPBY
    • IGNORE
    • INTERSECT
    • ISONORAFTER
    • KEYWORDMATCH
    • MEDIAN
    • MEDIANX
    • NATURALINNERJOIN
    • NATURALLEFTOUTERJOIN
    • PERCENTILE.EXC
    • PERCENTILE.INC
    • PERCENTILEX.EXC
    • PERCENTILEX.INC
    • PRODUCT
    • PRODUCTX
    • ROLLUPADDISSUBTOTAL
    • ROLLUPISSUBTOTAL
    • SELECTCOLUMNS
    • SUBSTITUTEWITHINDEX
    • SUMMARIZECOLUMNS
    • UNION
    • XIRR
    • XNPV

    And this is the list of the functions identical to the Excel ones:

    • ACOS
    • ACOSH
    • ACOT
    • ACOTH
    • ASIN
    • ASINH
    • ATAN
    • ATANH
    • BETA.DIST
    • BETA.INV
    • CEILING
    • CHISQ.DIST
    • CHISQ.DIST.RT
    • CHISQ.INV
    • CHISQ.INV.RT
    • COMBIN
    • COMBINA
    • CONFIDENCE.NORM
    • CONFIDENCE.T
    • COS
    • COSH
    • COT
    • COTH
    • DEGREES
    • EVEN
    • EXPON.DIST
    • GCD
    • ISODD
    • ISEVEN
    • LCM
    • ODD
    • PERMUT
    • POISSON.DIST
    • RADIANS
    • SIN
    • SINH
    • SQRTPI
    • TAN
    • TANH
  • Zero Inbox

    This is a blog post completely unrelated to the technical content I usually cover. But I’ve been asked so many times how I handle my email that I thought a blog post would save me the time of explaining. So, if you are not interested, wait for the next blog post, which will be about Business Intelligence again!

    First of all, I only use email. I’ve seen (and tried) several other technologies, with their to-do lists and workflow management. But the problem is that I work for so many customers, with different standards, that it’s impossible to standardize on a single technology. The mantra today is to keep it simple, and my conclusion is to use only one system. So I use email, and only email.

    Now, the problem is that email includes communication with customers, colleagues, and friends. But it also contains newsletters, alerts, and reports. And I also receive digests from forums, blog posts, Facebook messages, Yammer communications, SharePoint alerts, and so on. It seems crazy, but this means I have to handle email properly, and I cannot afford to lose or forget any of it. The side effect is that email is the most reliable way to get something done by me. I send emails to myself from my mobile phone to remember stuff. But if you try to contact me by SMS, Twitter, Facebook, WhatsApp, or whatever else that does not forward me an email… well, sooner or later I will see it, but you might be out of luck. I receive an average of 150-200 emails every working day, with peaks of 300. I send an average of 30-40 emails every working day. Occasionally I do something wrong and lose an email. But this happens once a month, maybe less. That is 99.97% reliability, and I can live with that. However, I can manage all this thanks to methodology and tools.

    Methodology: I use zero inbox. The idea is simple: at the end of the day, your inbox is empty. I have to admit that this does not happen every day, but only because I want to keep some messages visible in the inbox regardless of everything else. There are a lot of examples on the web about how to achieve that, but the principle is simple: triage often, process immediately or defer, but keep the inbox empty or relatively small.

    I’m addicted to Outlook and I use Office 365. It is very consistent and integrated. I tried Gmail with a personal account, but I never got into it. I work with people who would never get rid of their Gmail inbox, whereas I’m in a different camp. Outlook allows me to define rules that work on the server. This is very important for certain messages (forums, mailing lists) that I don’t want polluting my Inbox, because I will read them when it’s the right time during the day. No rush. Rules working server-side are important when I check email from my mobile phone. However, I have to use desktop Outlook, because I rely on a couple of add-ins that I absolutely need.

    First of all, I have to remove messages from the Inbox once I have processed them. I don’t delete them; I archive them in a relatively lean folder structure (fewer than 100 folders in a hierarchy). The archive is very important to quickly find stuff I need. However, moving messages quickly is important too. I use SimplyFile. It has an algorithm that predicts the right folder, and when the first choice is not the right one, you can browse the list or search the available folder names. I archive 80% of messages with a single keyboard shortcut, and the other 20% with fewer than 5 keystrokes. No mouse involved. Important for productivity. It also archives messages I send, so in the folder of a customer I have both messages received and sent. Very useful. The only problem is that when I triage and/or reply from my mobile phone, I know that I will have to complete the archiving process on the desktop. But I don’t like services that do something similar only online, because I want to be able to triage email when I have no connection. The latency of a bad connection is also another big issue, and I travel a lot. So if you have a suggestion for an alternative, please don’t waste time describing an online-only service, because I will never spend time trying it. I’m happy with Outlook; I want the same experience on a mobile device.

    Second, I have to defer mail that I cannot process immediately. Outlook has its own tools, but I prefer to use SnoozeIt. This tool simply moves a mail out of the inbox for a certain amount of time (which I can choose for every message). It could be one hour, one day, one week, one month. When it’s time, the message appears in the inbox again (marked as unread if you want). There are many other features (categorization, statistics, and so on), but I simply don’t care. I see the mail in the inbox at the time I figured would be good. I am writing this blog post because a few months ago I had this idea, but I couldn’t find the time until I finished my latest book about DAX. And finally that day arrived (well, you have to wait a few more weeks for the book because of final production processes, but the content is ready; now it is in the paging and proofreading stage).

    And that’s it.

    I have around 200 messages snoozed for a future date. This does not correspond to 200 tasks I delayed; most of them are tasks that I cannot do until a certain date, or just reminders to check whether a certain action has been done by someone else. Well, I do have many tasks I delayed because I didn’t have time, but not 200!

    I have been using this technique since 2007, starting with SpeedFiler (no longer supported, I think). I moved to SimplyFile in 2010 because SpeedFiler did not support Outlook 2010. I have been using SnoozeIt since its first beta in 2014. It works very well for me. However, I’ve seen that it is not good for everyone. Depending on your habits, you might love it or hate it. I’m not trying to convince anyone to use this technique; I’m just writing down my experience because I think it will save me time when someone sees my empty inbox and asks how it is possible.

    DISCLAIMER: I regularly paid for the licenses of SpeedFiler, SimplyFile, and SnoozeIt that I use. I do not receive any compensation from these companies, and I will not get any fee for purchases made by blog readers. Feel free to add alternative products in the comments, provided it is your experience and not just advertising.

  • The ALLSELECTED function under the cover #dax #tabular #powerpivot #powerbi

    Alberto Ferrari and I recently completed the writing of The Definitive Guide to DAX, and we spent months correctly describing the internals of evaluation context in this language. There are many details that make a data model work with both DAX and MDX, and sometimes there are behaviors that are not intuitive to understand.

    A function that seems to work like magic is ALLSELECTED, which is very useful when you create measures that will be used in Excel pivot tables. What is not obvious is that the DAX engine has to figure out what the user selected in a pivot table from the MDX query that the pivot table generates. In reality, there is no communication between client and server other than the MDX query, and ALLSELECTED is not related to MDX; it is a DAX function!

    Alberto extracted part of this description from the book and published the Understanding ALLSELECTED article on SQLBI. You will see that the magic in this function is just a particular manipulation of the filter context, which keeps track of the iterated table in the filter context every time a context transition happens. Not clear enough? Well, the article explains it better!
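    Just to show the typical use case, the following "percentage over the visible total" pattern relies on ALLSELECTED to retrieve the selection made by the user outside of the current cell of the pivot table. A minimal sketch, assuming a hypothetical Sales[Amount] column:

    Pct of Selected :=
    DIVIDE (
        SUM ( Sales[Amount] ),
        CALCULATE ( SUM ( Sales[Amount] ), ALLSELECTED ( Sales ) )
    )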

  • VertiPaq Analyzer for Analysis Services #ssas #tabular #powerpivot #powerbi

    During the writing of The Definitive Guide to DAX, I wanted a simple way to analyze the content and distribution of the data compressed in the VertiPaq engine, which is used by Analysis Services Tabular, Power Pivot, and Power BI models. I always relied on BISM Memory Report (thanks Kasper!), but when you focus on a single database there are a number of details available in other dynamic management views (DMVs) besides the one used by BISM Memory Report.

    I created VertiPaq Analyzer, a Power Pivot data model that collects data from these other DMVs and shows it in pivot tables, providing information about compression, the size of data and related structures (such as relationships and hierarchies), and column selectivity (very important to understand how to optimize DAX queries).

    You can download the workbook here, and read the article that describes all the metrics used.
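    If you want to peek at the raw numbers yourself, you can query these DMVs directly from DAX Studio or from an MDX query window in SSMS. For example, the following statement reads one of the several DMVs that the workbook combines (treat this as a sketch of the approach, not as the full set of queries used):

    SELECT * FROM $SYSTEM.DISCOVER_STORAGE_TABLE_COLUMNS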


  • The new Power BI Desktop is here #powerbi #dax

    After months of public preview, Microsoft is today releasing the Power BI service to general availability. The preview was really a beta that evolved over these months, and I personally like the approach that Microsoft's developers have now. I did not read the usual “it’s by design” answers to the many comments and suggestions provided in support forums and community areas; instead, I have seen a continuous commitment to helping users and partners create analytical solutions.

    You can read the official announcements and many details on the Power BI blog. In this blog post, I want to focus on Power BI Desktop (formerly known as Power BI Designer) and then make a few considerations about the entire platform. Chris Webb wrote a good commentary about the role of this tool in the Microsoft BI stack. I would like to add my personal point of view about it.

    Power BI Desktop is a tool that you can download for free. It includes the features of Power Query, Power Pivot, and Power View, so you can create an analytical solution that works locally without any licensing cost. Some read this as a strange move for a company like Microsoft, but things are changing, and going to the cloud is not a question of “if” but of “when”. Thus, you start using Power BI Desktop today for a local model that works on a single desktop. Whenever you are ready to access cloud features (including the ability to use the Power BI mobile apps to navigate your data on a mobile device), you upload it to the Power BI service, which is still free for personal use with 1GB of data uploaded to the cloud. You start to pay only to use the Power BI Pro features, which include collaboration (sharing with other people), on-premises gateways, hourly scheduled refresh, and larger data capacity (10GB/user). The cost is $9.99 per user per month, which is a competitive price from my point of view. More details are on the pricing page of the Power BI site.

    However, the fact that Power BI Desktop is completely free is not what is most interesting to me. What is important is that I have seen what Microsoft did in the last 9 months, and if they continue to keep this pace of new releases and improvements, this tool will be something you would be ready to pay for even on a desktop, because you will depend on it. The fact that it is free is just a welcome additional benefit. Let me list a few of the important facts I see here; not all of them are highlighted in the official announcements.

    • New in-memory engine: Power BI Desktop runs a local instance of the Analysis Services engine, the same engine that also runs Power Pivot for Excel (however, the engine here has had 3-4 years of evolution since the version that runs in current versions of Excel and Analysis Services). An important difference is that Power Pivot is an add-in that runs in-process, and its releases are managed by the Office team. The consequence: a slow update cycle, priority given to compatibility with older versions, and more complex testing. Since Power BI Desktop has an independent release cycle and runs locally in the user context, you will be able to update it more frequently. The engine we have now is the same that runs in Excel 2016, but it already exposes features that are not part of Excel 2016, such as bidirectional cross-filtering for relationships in the data model. The new engine has many performance improvements, but what I like most is the ability to get improvements with monthly updates, instead of waiting for service packs of a bigger beast like Office.
    • New DAX: all the DAX code you wrote is good and still works, but there are new DAX functions and the syntax for DAX variables, which are very important to simplify DAX code and improve performance.
    • New graphical engine: if you used Power View in the past, or you used Power BI Designer during the preview, be prepared. Power BI Desktop has a completely new graphical engine, based on D3.js, which is amazingly fast. All the graphical components created by Microsoft are also open source (available on GitHub), and you can extend Power BI data visualizations by extending these classes. There is food for system integrators and ISVs here, but first of all the results in terms of user experience are very good. Maybe you will find some bugs and some features to improve, but the roadmap is clear and very good. Power BI Desktop is not only a product, it is also a platform; it can become an entire ecosystem (adoption of the Power BI service and Power BI Desktop will be important for that).
    • Complete design experience: if you used the Power BI Designer preview, forget the limitations you had. The Power BI Desktop released now has the ability to create DAX measures, calculated columns (with a preview of the result of your formula in a table view similar to Power Pivot), and relationships (you have the diagram view). You can edit longer DAX expressions in a decent window with a decent editor (only one formula at a time for now; I hope they will implement a sort of complete-script editor in the future).
    • Ready for hybrid solutions: if you have your data on-premises in Analysis Services Tabular, you can connect Power BI directly to the service, from both Power BI Desktop (which will contain only a set of reports in that scenario, without a copy of the data in the PBIX file) and Power BI on the web (so the cloud service only renders the report, but data is not persisted on cloud servers, and you can log and audit all the incoming queries to your on-premises server). As soon as the same feature becomes available for Analysis Services Multidimensional (Microsoft has already announced it, even if we still do not have a release date), the number of semantically rich data models ready to be used in Power BI will grow exponentially in one day. Be ready for that.
    • Direct publishing on Power BI: you can publish your data model and reports from Power BI Desktop to the Power BI service by clicking a button in the Power BI Desktop user interface. The alternative is to upload the PBIX file to the server; not a big deal, but this integration just makes it easier and faster.
    • Integration with other vendors: the big news is that you can also publish your reports to other servers. Power BI Desktop has a Publish button that opens a list of options (see Chris Webb's post here). The Power BI service is the first option, and the second one will be Pyramid Server. Pyramid Analytics is the first vendor to implement the ability to publish a Power BI Desktop file to their server, which runs on-premises. This integration is not available yet and will be released in the next few months (read more details about the announcement from Microsoft and Pyramid Analytics). My understanding is that, once available, it will make it possible to publish reports you design with Power BI Desktop without any access to any cloud service. I am really curious to see all the details, because you can imagine how many questions I have.
    • Integration with other services: the number of available connectors (especially for other cloud services) has increased at a very fast pace; I have seen a new connector every week in the last months, and there are more to come. Probably Power BI is already the easiest way to get data from other cloud services and publish dashboards. One example above all is Google Analytics: Power BI is so quick and simple for getting the most commonly used information that I use it on a regular basis now, and I access Google Analytics directly only when I need particular details.

    As you see, this is a very technical and feature-oriented point of view. If you read between the lines, what is more important to me is that this is the foundation of a new ecosystem for business analytics. Involving partners in extending connectors, publishing, and visualizations is key to achieving that goal. This was not the case in previous versions of Power BI, which were strongly integrated with Office 365. The connection with Office 365 is still here (you have more features available when you use OneDrive for Business for navigating Excel data models), but it is no longer a precondition to start using the service.

    This is the real challenge: creating the reference ecosystem for Business Analytics. If you look at the market from this point of view, I would say that nobody has clear leadership today. There are strong players in certain sectors, but nobody controls an ecosystem here. The next 12-18 months will tell whether the bet is right.

  • Power BI Desktop is coming

    Microsoft is going to release the new Power BI service on Friday, July 24th. The number of new features is huge, but remember that this is just the beginning of a new wave of continuous updates, similar to what we have seen for Power Query in the last months, just at a larger scale.

    I will cover in this blog some more details about the impact of Power BI from many different points of view. In the meantime, I collected a few useful links to get some previews of what is coming:

    The title of this blog post is dedicated to Power BI Desktop, because I think we are going to see the first version of a complete environment to design data models, ETL, and visualizations, completely detached from Excel. It will also be free, but I am more interested in the features that will be available. Having a distribution unrelated to Office will help in those environments where an Office update requires years...

    As a side note, there is also very important news for Corporate BI: the Public Preview of Azure Data Catalog, still a young service but a very promising one.

    Waiting for more news at the end of this week... 

  • Passing parameters to DAX measures

    You cannot create functions in DAX, and this is a limitation for certain abstractions you might want to implement in complex models.

    I used a pattern that allows you to "pass a parameter" to a DAX measure, simulating the behavior of a function, at least in certain conditions and with many limitations. In practice, you can write:

    [Discounted Amount] ( Par1[Value] = 0.20, Par2[Value] = 0 )

    Well, I really don't like this syntax; in fact, DAX Formatter translates it into:

    CALCULATE (
        [Discounted Amount],
        Par1[Value] = 0.20,
        Par2[Value] = 0.00
    )

    If at this point the trick does not seem pretty obvious... read the full article Parameters in DAX Measures on SQLBI! 
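    Just to anticipate the idea with my own simplified sketch (not necessarily the exact implementation described in the article): Par1 and Par2 would be disconnected parameter tables, each with a single Value column, and the measure would read its "arguments" from the filter context that the CALCULATE shown above overrides. For example:

    -- Hypothetical sketch: Par1 is a disconnected table with a Value column
    Discounted Amount :=
    SUMX (
        Sales,
        Sales[Quantity] * Sales[Unit Price] * ( 1 - MIN ( Par1[Value] ) )
    )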

  • DAX Studio 2.2 released - tracing support for #powerpivot and much else! #dax

    Thanks to Darren Gosbell we have DAX Studio 2.2. In this new release:

    • Tracing (query plans and server timings) available for Power Pivot
    • Basic implementation of Intellisense
    • Connect to Multidimensional SSAS servers
    • Support for multiple Power BI Designer instances
    • Highlight unnatural hierarchies (read here why this is so important for performance)

    A more complete description of the new features is available in Darren's post.

    I think this release is a revolution for Power Pivot users. Until now, you had to restore a Power Pivot model in Analysis Services and then run your queries in DAX Studio to analyze performance. Now you don't need anything other than Excel. This is amazing.

    If you are wondering how to use this feature, simply follow these steps:

    1. Create a pivot table that generates a performance issue
    2. Capture the MDX query with OLAP PivotTable Extensions, using its "View PivotTable MDX" feature
    3. Open DAX Studio from the Excel AddIn ribbon
    4. Copy the MDX query in DAX Studio
    5. Enable Query Plans and Server Timings buttons
    6. Run the query

    That's it. At this point, you can improve your productivity by copying the code of your DAX measure to the beginning of the MDX query.

    For example, if you have this MDX query from your PivotTable (look at the Sales Amount measure):

    SELECT
    { [Measures].[Sales Amount], [Measures].[Sales Rows] } DIMENSION PROPERTIES PARENT_UNIQUE_NAME
    , MEMBER_VALUE
    , HIERARCHY_UNIQUE_NAME ON COLUMNS
    , NON EMPTY Hierarchize (
     {
      DrilldownLevel (
       { [Date].[Calendar].[All] }
       ,
       ,
       , INCLUDE_CALC_MEMBERS
      )
     }
    ) DIMENSION PROPERTIES PARENT_UNIQUE_NAME
    , MEMBER_VALUE
    , HIERARCHY_UNIQUE_NAME ON ROWS
    FROM [Model] CELL PROPERTIES VALUE
    , FORMAT_STRING
    , LANGUAGE
    , BACK_COLOR
    , FORE_COLOR
    , FONT_FLAGS 

    You just have to add this line *before* your MDX statement:

    WITH MEASURE 'Sales'[Sales Amount] = SUMX ( Sales, Sales[Quantity] * Sales[Unit Price] )

    The table name (Sales in this case) should correspond to a table of your model; use the table where you defined the measure originally. Now your definition of Sales Amount overrides the one in the data model for this query, and you can easily change the DAX code of the measure definition and test the entire query again (maybe clearing the cache before each run) until you obtain a better version. Then, simply copy the code and paste it into your Power Pivot model, replacing the previous definition of the same measure.
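    For example, a hypothetical optimized variant to test could precompute the row-level product in a calculated column (a fictional Sales[Line Amount] column here) and aggregate it with a plain SUM:

    WITH MEASURE 'Sales'[Sales Amount] = SUM ( Sales[Line Amount] )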

    You will see that this is way more productive than changing the code in Power Pivot and refreshing the pivot table every time! 

     

  • SQL Saturday #454 and Expo 2015 - fill the survey and submit sessions! #sqlsatexpo

    2015 is a special year for Italy, because the country hosts Expo 2015, the current Universal Exposition. For this reason, the Italian PASS chapter is promoting a special edition of SQL Saturday, a free training event for SQL Server professionals. SQL Saturday #454, in Turin on October 10, 2015, has the following characteristics:

    • More than 20 sessions on SQL Server, Business Intelligence, and the Azure Data Platform.
    • All the sessions will be in English.
    • The venue is in the center of Turin, close to the train station:
      • You can be at the Expo in 40 minutes
      • You can travel to Milan in less than 1 hour
    • Turin is usually less expensive than Milan, and you might stay for at least one night, dedicating the Sunday after SQL Saturday to visiting the Expo or Milan.

    We want to provide the best experience to the attendees, and we also want to help those of you traveling with family and/or friends who might not be interested in the technical content. For this reason, we are planning a web page containing information about side and/or alternative activities during the SQL Saturday. You will get more information about that starting in July.

    However, we first need a good estimate of the number of attendees, in order to correctly size the venue and to verify the interest in side activities, so that we can decide how much space to dedicate to such a section of the upcoming website. These operations have to be completed months ahead of the event.

    For this reason, we ask you to fill in the survey at http://www.sqlsatexpo.com/, providing us with important information about your intention to visit Expo 2015 and about the number of people who will travel with you.

    If you are a speaker, please submit your sessions, considering that the agenda will prioritize three topics: SQL Server 2016, Power BI, and Azure Data Platform.

    See you in Turin! 

  • Optimize Heap Memory Settings for Analysis Services Tabular 2012/2014 #ssas #tabular

    In the last months I have assisted many companies implementing solutions based on Analysis Services Tabular. There is not much difference between the 2012 and 2014 versions, because SQL Server 2014 didn’t introduce new features in the BI services. Thus, my considerations are valid for both.

    One issue observed in different cases was a general performance degradation after a few days of work. Restarting the msmdsrv.exe service was enough to restore normal performance. The problem might affect both query and process operations. Microsoft released a hotfix (KB2976861) that mitigates the slowness of a full process, but it does not completely solve the problem.

    The real reason for the issue is the fragmentation of the memory heap. Analysis Services can use its own heap algorithm or the standard Windows one. It seems that the workload generated by Tabular, which creates objects of dynamic size, is an issue for the Windows Low-Fragmentation Heap, which is the default setting in Analysis Services (because of its better scalability).

    In the Heap Memory Settings for Analysis Services Tabular 2012 / 2014 article on SQLBI you can find a complete description of the settings that control the heap memory used by Analysis Services. If the default values produce the symptoms described above, consider changing them following the suggestions included in the article.

  • DAX measures in Power BI Designer - and new DAX syntax finally here

    The latest update of Power BI Designer allows you to create measures (not calculated columns yet). Download the new version of Power BI Designer and you will see the New Measure button. The editor is much better than anything you have seen in Excel 2010/2013, but it can still be improved (more screen real estate is the first request).

    The really important fact is another one. You have a new version of DAX in your hands. It is not just that you have some new functions, or that the engine is faster (way faster). No, the big change (which is not a breaking change, but just a new feature) is "variables". I'm not sure this is the right name, but it is the intuitive name you give to a feature where you use the keyword VAR before specifying an identifier. What are we talking about? Look at this example:

    Quantity :=
    VAR
        TotalQuantity = SUM ( Sales[Quantity] )
    RETURN
        IF (
            TotalQuantity > 1000,
            TotalQuantity * 0.95,
            TotalQuantity
        ) 

    You can assign an expression to an identifier within a larger DAX expression. The evaluation context is the one where you write the definition. You can avoid repeating the same expression multiple times within the same measure, and you can simplify the code by avoiding too many nested evaluations and by avoiding EARLIER in most cases. For example, consider this expression in a calculated column:

    = COUNTROWS ( FILTER ( Sales, Sales[Date] <= EARLIER ( Sales[Date] ) ) )

    Now you can write:

    =
    VAR
        CurrentDate = Sales[Date]
    RETURN
        COUNTROWS ( FILTER ( Sales, Sales[Date] <= CurrentDate ) )

    Which is longer, but way more readable.

    A longer and more detailed article about the new VAR / RETURN syntax in DAX is available at Variables in DAX on SQLBI. 

  • New features in DAX Editor #dax #tabular #ssas

    In the last few weeks, Teo Lachev and I took ownership of the DAX Editor code. It is an add-in for Visual Studio that allows you to edit DAX measures in a text editor instead of using the measure grid. It also provides other features (such as editing the measures of an online Tabular database and running a few queries). The user interface is not very user-friendly, so I suggest you carefully read the documentation in order to understand how you can edit the DAX measures in your own Visual Studio project.

    So, what are the new features we added in the last few weeks?

    We also re-published DAX Editor in the Visual Studio Extension Gallery. If you had a previous version, please uninstall it and then install the new one. We had to change the internal VSIX ID, so the upgrade to this new version doesn’t happen automatically. However, once you install this new build, upgrading should be easier in future builds.

  • Create API for Power BI Designer #powerbi

    In the last months, I’ve been trying to suggest a direction for Power BI, but I have started to realize that these days Microsoft is really prioritizing features based on customer feedback. This can make it harder for a new idea to be prioritized, because nobody asks for something completely new. A few days ago, Jamie Thomson asked for support for PBIX files saved on OneDrive. I think it’s a good idea; Excel is already supported there, and you can add your vote here: http://support.powerbi.com/forums/265200-power-bi/suggestions/7259274-view-pbix-files-on-onedrive-onedrive-4-business

    But I want to ask for community support in requesting a feature in Power BI Designer that would help thousands of ISVs and millions of users. So, Microsoft, please:

    Create a Power BI Designer API and support connecting to PBIX from Excel

    You can click on the link and vote for it. If you are in a hurry, just do these 2 clicks and receive a big thank you! If you have more time, please let me elaborate - I will include the description I published on UserVoice, adding a few comments.

    Power BI Designer saves a local PBIX file, which can serve as a file for exporting data and a data model - in other words, it’s a format containing a complete semantic model. All the applications that today export data in several formats (CSV, Excel, XML) might provide a richer semantic model by exporting a PBIX file.

    Many ISVs/SIs that have OLTP and other applications storing data in some database usually struggle to offer a compelling BI story to their customers. The smaller they are, the more they feel this pressure, because the effort they can put into their custom software is probably minimal.

    Today these ISVs/SIs integrate their solutions with external vendor technologies (QlikView is a common choice here). However, the cost of such a solution for the end user is not always appealing, and for this reason the MS partner ecosystem is always looking for components (charts and pivot tables) to integrate into their solutions.

    Providing them with an easy and inexpensive way to produce PBIX files “ready to use” straight from their product/solution would deliver several benefits:

    • Customers would have something ready to be uploaded to Power BI service
    • ISV/SI would be able to provide a BI solution integrated with MS ecosystem
    • ISV/SI can implement solutions like “send a PBIX file via mail every week to all the agents including only the data of their prospects/customers”
      • Today they already do that using the .CUB format, which can be consumed by both Excel and custom applications
    • Microsoft would increase the number of Power BI users very quickly - small ISVs/SIs would be able to implement such an integration very fast

    What I propose to do is, in descending order of importance:

    1. Support Power BI Designer as a local engine, with an API that can be used by anyone, and officially support local connections from other programs (starting with Excel)
      • The API should provide the ability to create a data model and to populate it with data by just using the API, without any manual interaction
      • Providing the ability to connect from other clients (today it is possible, but not officially supported) would increase adoption.
    2. Document and “open” the PBIX file format, so that it can be generated by anyone
      • I think that this is easy for the data model, but not for the data.
      • But without the data, this model would not be so useful, requiring a manual refresh to be populated.
    3. Move Power BI Designer to open source
      • Not really a priority in my opinion, but if the first two weren’t possible, this one could be OK

    If you think this is a good idea, here is the point. The top ideas on the site are SQL Server on-prem and SQL Server Analysis Services cubes - very important features, with more than 1,000 votes each. We need to get there to receive attention from Microsoft. Thus, your vote alone is not enough. Please forward the message, convince other people to vote, add your comments, talk very loud.

    Microsoft is prioritizing cloud services, but getting quick and large adoption through the small databases of thousands of applications, each one with hundreds of users, means generating a huge volume of ready-to-use data models. Yes, we need DAX in Power BI Designer in order to make it useful, but we know it is already on the roadmap. Microsoft released support for Google Analytics yesterday, and for now it works only in Power BI Designer. Working in a desktop app has its own advantages: I have never had such a powerful tool to navigate Google Analytics data. I’m waiting for DAX. But being able to generate a PBIX file from within an application would be great leverage for Power BI adoption, also for people who are not ready to query data or create a new data model, but who want to analyze their data.

    And, of course, if you think it’s a bad idea… comments are open, and I’d like to hear other points of view.

    Thanks!
