
Davide Mauri

A place for my thoughts and experiences on SQL Server, Business Intelligence and .NET

  • Devweek 2016

    I’m really happy to announce that I’ll be back in London, at the DevWeek 2016 Conference, in April. I’ll be talking about

    Though the conference name may imply that it’s dedicated to Developers, in reality there are *a lot* of interesting sessions on Databases, Big Data and, more generally, the Data Management and Data Science areas.

    Here’s the Agenda

    http://devweek.com/agenda

    I’ll be there along with another well-known name on this blog, Dejan Sarka, just to make sure that BI/Big Data/Data Science and the like are well represented among all those developers :)

    See you there!

  • (Initial) Conference Plan for 2016

    2016 has not started yet and it already looks exciting to me! I already have plans for several conferences and I’d like to share them with you all, in case you’re interested in some of the topics.

    I’ll be presenting at Technical Cloud Day, a local Italian event, on January 26th, and I’ll be speaking about:

    • Azure Machine Learning
    • Azure Stream Analytics

    If you’re interested (and speak Italian) here’s the website:

    http://www.technicalcloudday.it/

    I’ll also be present at some international events, like

    SQL Konferenz

    Here I’ll deliver my “classic” Agile Data Warehousing workshop during the Pre-Con days:

    • Why a Data Warehouse?
    • The Agile Approach
    • Modeling the Data Warehouse
      • Kimball, Inmon & Data Vault
      • Dimensional Modeling
      • Dimension, Fact, Measures
      • Star & Snowflake Schema
      • Transactional, Snapshot and Temporal Fact Tables
      • Slowly Changing Dimensions
    • Engineering the Solution
      • Building the Data Warehouse
        • Solution Architecture
        • Naming conventions, mandatory columns and other rules
        • Views and Stored Procedure usage
      • Loading the Data Warehouse
        • ETL Patterns
        • Best Practices
      • Automating Extraction and Loading
        • Making the solution automatable
        • BIML
    • Unit Testing Data
    • The Complete Picture
      • Where does Big Data come into play?
    • After the Data Warehouse
      • Optimized Hardware & Software
    • Conclusion

    You can find more here:

    http://sqlkonferenz.de/agenda.aspx

    I’ll also have a regular session dedicated to SSISDB and its internals: SSIS Monitoring Deep Dive. I’ll show what’s inside and how you can use such knowledge to build (and improve) something like my SSIS Dashboard: http://ssis-dashboard.azurewebsites.net/
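
    If you’re curious about the kind of query involved, here’s a minimal sketch (written for this post, not the exact code behind the dashboard) that lists the most recent executions straight from the SSISDB catalog:

    SELECT TOP (15)
        e.execution_id,
        e.folder_name,
        e.project_name,
        e.package_name,
        e.status,        -- 1 = created, 2 = running, 3 = canceled, 4 = failed, 7 = succeeded
        e.start_time,
        e.end_time,
        DATEDIFF(SECOND, e.start_time, e.end_time) AS elapsed_sec
    FROM catalog.executions AS e
    ORDER BY e.execution_id DESC;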

    SQL Nexus

    This is a new Nordic conference where, along with Allan Mitchell, I’ll be presenting a new (super-cool, IMHO) workshop. We’ll discuss the Lambda Architecture, a generic reference architecture for building Real-Time Analytics solutions, and how it can be built using the features that Azure offers. We’ll show how to use Azure Event Hubs, Stream Analytics, Azure Data Lake, Power BI and many other cool technologies from Azure.

    You can find more details here:

    Reference Big Data Lambda Architecture in Azure
    The Lambda Architecture is a new generic, scalable and fault-tolerant data processing architecture that is becoming more and more popular now that big data and real-time analytics are frequently requested by end users, enabling them to make informed decisions more precisely and quickly. During this full-day workshop we'll see how the Azure Data Platform can perfectly support such an architecture and how to use each technology to build it. From Azure IoT Hub and Azure Stream Analytics to Azure Data Lake and Power BI, we'll build a small Lambda Architecture solution so that you'll be able to become confident with it and with its implementation using Azure technologies.

    http://www.sqlnexus.com/pre--and-main-conference.html

    Well if you’re interested in one or more of these topics, you know where to go now. Bye!

  • Custom Data Provider in Datazen

    Playing with Datazen over the last few days, I had to solve a quite interesting problem that took me some time, but that also allowed me to dig deeper into the Datazen architecture in order to find a way past its (apparent) limits.

    Here’s the story, as I’m sure it will be useful to someone else too.

    One of our current customers has a quite complex Analysis Services dynamic security setup. Besides applying security based on who is accessing the data, they also want to apply security based on how someone accesses that data. To satisfy this requirement, a specific extension to Excel (their chosen client) has been developed, and it uses the CustomData() MDX function.

    So, here’s the problem: how can I specify a value for the CustomData property in the SSAS connection string in Datazen, given that no such property is exposed by default by the native SSAS data provider?

    Luckily Datazen supports custom data providers, so it’s quite easy to create a new one that exposes the properties you need:

    http://www.datazen.com/docs/?article=server/managing_data_provider_schemas

    I first tried to go down the “Overriding built-in data providers” road, but I wasn’t able to make it work. I added the “CustomData” property to a file that overrides the default SSAS data provider settings, but in the end the “CustomData” property was the only option I was able to see in the overridden native provider. So I created a new SSAS data provider instead, and that’s it, everything works perfectly:

    <dataproviderschema>
        <id>MSSSAS</id>
        <enabled>true</enabled>
        <name>SSAS.EPSON</name>
        <type>ssas</type>
        <properties>
            <property>
                <name>Provider</name>
                <value>MSOLAP</value>
            </property>
            <property>
                <name>Data Source</name>           
            </property>
            <property>
                <name>Initial Catalog</name>
            </property>
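            <!-- the extra property: expose CustomData so a value can be passed on the connection -->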
            <property>
                <name>CustomData</name>
                <value>{00000000-0000-0000-0000-000000000000}</value>
            </property>
        </properties>
    </dataproviderschema>

    Be aware that Datazen does *a lot* of caching, so you’ll have to stop the Core service BEFORE you edit/create the XML file, otherwise you may find it overwritten with cached data. Also be sure to IISRESET your web server, otherwise you can easily go mad trying to understand why what you’ve just done is not showing up in the UI.

    Besides the caching madness, everything works great.

    Hope this helps!

  • Configuring Pass-Through Windows Authentication in Datazen

    I’ve been working with Datazen lately (I’m working with a customer that literally fell in love with it) and one of the last things we tried, as part of a POC before going into real development, is integration with Windows Authentication.

    It’s really easy to do: you just follow the instructions here (in the “Authentication Mode” section)

    http://www.datazen.com/docs/?article=server/installing_server

    and it just works. As the documentation suggests, you just have to specify the domain name and that’s it.

    Of course, after that, you may also want to enable pass-through authentication, so that when a user tries to access a dashboard via the HTML interface, Datazen will use the logon credentials without going through an additional logon screen.

    Here things can be tricky if you just follow the documentation here:

    http://www.datazen.com/docs/?article=server/configuring_integrated_windows_authentication

    which is correct, but only to a certain degree. Everything in it is correct; it just fails to mention a *very* important thing you have to know to make sure it works as expected: you have to provide ALL FOUR SETTINGS (Server, UserName, Domain, Password) in order to make it work.

    If you forgot to do it during installation, no problem: you can do it later by setting the

    • ad_server
    • ad_username
    • ad_domain
    • ad_password

    configuration values as explained here:

    http://www.datazen.com/docs/?article=server/server_core_settings

    After that, the magic happens and everything works perfectly.

    PS

    Of course you have to have configured Kerberos Authentication and Delegation correctly, but that’s another story.

  • SQL Server 2016 CTP 2.3: Management Studio and Data Tools Updates

    After the release of SQL Server 2016 CTP 2.3, Management Studio and SQL Server Data Tools have been updated too. Having three different teams working on three different products means three different places one has to check to become aware of the updates, so to make it easier for everyone who’s asking, here is the complete set of links for a full SQL Server 2016 CTP 2.3 installation (SQL Server platform + all the clients):

    SQL Server 2016 CTP 2.3

    SQL Server 2016 Management Studio Preview August 2015 Update

    SQL Server 2016 Data Tools (with support for both Database & Business Intelligence) Preview August 2015 Update

    Enjoy!

  • SQL Saturday in Italy…English version!

    2015 is a special year for Italy, because the country hosts Expo 2015, the current Universal Exposition. For this reason, the Italian PASS chapter is promoting a special edition of SQL Saturday, a free training event for SQL Server professionals. SQL Saturday #454 in Turin on October 10, 2015 has the following characteristics:

    • More than 20 sessions, on SQL Server, Business Intelligence and Azure Data Platform.
    • All the sessions will be in English.
    • The venue is in the center of Turin, close to the train station:
      • You can be at the Expo in 40 minutes
      • You can travel to Milan in less than 1 hour
    • Turin is usually less expensive than Milan and you might stop for at least one night, dedicating the Sunday after SQL Saturday to visiting the Expo or Milan.

      We want to provide the best experience to the attendees, and we also want to help those of you traveling with family and/or friends who might not be interested in the technical content. For this reason, we are planning a web page containing information on side and/or alternative activities during the SQL Saturday. You will get more information about that starting in July.

      However, we first need a good estimate of the number of attendees, in order to correctly size the venue and to verify the interest in side activities, so that we can modulate the time to allocate to such a section of the upcoming web site. These operations have to be completed months ahead of the event.

      For this reason, we ask you to fill in the survey at http://www.sqlsatexpo.com/, providing us with important information about your intention to visit Expo 2015 and about the number of people who will travel with you.

      If you are a speaker, please submit your sessions, considering that the agenda will prioritize three topics: SQL Server 2016, Power BI, and Azure Data Platform.

      See you in Turin!

    • SQL Konferenz 2015 Slide & Demo

      Last week I spoke at the SQL Konferenz in Darmstadt, near Frankfurt. The conference was great and I met a lot of good SQL friends over there. For anyone interested, here you can find the slides & demos of the sessions I delivered:

      (Near) Real-Time Data Integration with SQL Server, On-Premises & Cloud
      http://www.slideshare.net/davidemauri/real-time-data-integration

      Schema-Less Table & Dynamic Schema
      http://www.slideshare.net/davidemauri/schema-less-table-dynamic-schema-44295422

      You’ll find a link to evaluate the session on SpeakerScore and to download the slides in the last slide of each deck.

      Enjoy!

    • Iris Multi-Class Classifier with Azure ML

      Like many of us, I’m passionate about informatics *and* mathematics, which, of course, leads me to be passionate about the outcome of their marriage: Databases and Machine Learning.

      Now that Machine Learning is becoming a kind of “commodity” thanks to AzureML, I can finally start to use it in any project, even the not-so-big ones.

      AzureML, for those who don’t yet know it, is Microsoft’s Machine Learning offering for the cloud. You can start using it for free just by activating your subscription here:

      https://studio.azureml.net/Home

      Once activated, you’ll find a lot of ready-to-use stuff: from “experiments” (a kind of “program”) to datasets, components and models (algorithms).

      One thing I noticed is missing is the full Iris dataset, one of the most famous and most used datasets for starting to learn machine learning. In AzureML you can find a subset of it, usable for binary classification, but the original one is much more interesting since it can be used for multi-class classification.

      In order to fill this little gap and to create an easy tutorial to help everyone start to get confident with AzureML and machine learning in general, I’ve created a 10-step (well…the Italian way of 10 steps ;)) tutorial that can be found here:

      http://www.slideshare.net/davidemauri/iris-multiclass-classifier-with-azure-ml 

      or here

      https://speakerdeck.com/yorek/iris-multi-class-classifier-with-azure-ml

      choose the website you prefer :) and start playing!

      As usual, comments and feedback are more than welcome!

    • SSISDB Monitoring Queries on GitHub

      I’ve moved my SSISDB scripts from Gist to GitHub where I can maintain them more comfortably. So far, I’ve published 6 scripts:

      • ssis-execution-status: Latest executed packages
      • ssis-execution-breakdown: Execution breakdown for a specific execution
      • ssis-execution-dataflow-info: Data Flow information for a specific execution
      • ssis-execution-log: Information/Warning/Error messages found in the log for a specific execution
      • ssis-execution-lookup-cache-usage: Lookup usage for a specific package/execution
      • ssis-execution-package-history: Execution historical data

      I use them almost every day, when I need a quick glance at what’s going on in Integration Services and when I need to do some deep analysis of errors and problems.

      You can find them here:

      https://github.com/yorek/ssis-queries
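
      To give you an idea of what’s in there, here’s a minimal sketch along the lines of ssis-execution-log (a simplified example written for this post, not the actual script from the repository): it pulls the warning and error messages logged for a given execution.

      DECLARE @execution_id BIGINT = 12345;   -- put a real execution id here

      SELECT
          em.message_time,
          em.message_type,          -- 110 = Warning, 120 = Error
          em.package_name,
          em.message_source_name,
          em.message
      FROM catalog.event_messages AS em
      WHERE em.operation_id = @execution_id
        AND em.message_type IN (110, 120)
      ORDER BY em.message_time;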

      If you’re also wondering what happened to the SSIS Dashboard project

      https://github.com/yorek/ssis-dashboard

      …don’t fear, it’s not dead. I’m still working on it, but since I’m working on it only in my free time, updates are taking much more time than expected.

      PS

      Funnily enough, Andy Leonard published a script to analyze lookups just a couple of hours before me. You may also want to take a look at his post: http://sqlblog.com/blogs/andy_leonard/archive/2015/01/16/advanced-ssis-parsing-ssis-catalog-messages-for-lookups.aspx

    • SQL Konferenz 2015

      In the first days of February I’ll be speaking at the German SQL Konferenz 2015 in Darmstadt (near Frankfurt) along with a lot of other friends:

      http://www.sqlkonferenz.de/agenda.aspx

      I’ll be talking about two topics that my developer side loves the most: Service Broker and “Dynamic Schema”:

      Schema-less table & Dynamic Schema
      How do you manage a system in which the schema of the data cannot be defined "a priori"? How do you quickly search for entities whose data is spread over multiple rows? In this session we are going to address these issues, historically among the most complex to manage, yet very common and very delicate with regard to performance. From EAV to Sparse Columns, we'll see all the possible techniques for doing it in the best way possible, from the usability, performance and maintenance points of view.
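
      To give a flavour of one of the techniques mentioned in the abstract, here's a minimal sketch (a hypothetical table, not taken from the session demos) of Sparse Columns combined with a column set:

      CREATE TABLE dbo.ProductAttributes
      (
          product_id     INT NOT NULL PRIMARY KEY,
          colour         NVARCHAR(50) SPARSE NULL,
          weight_kg      DECIMAL(9,3) SPARSE NULL,
          voltage        INT          SPARSE NULL,
          -- exposes all sparse columns as a single, untyped XML blob
          all_attributes XML COLUMN_SET FOR ALL_SPARSE_COLUMNS
      );

      -- each row stores only the attributes that make sense for that product
      INSERT INTO dbo.ProductAttributes (product_id, colour, weight_kg)
      VALUES (1, N'Red', 0.250);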

      Real Time Data Integration (in the Cloud or not)
      Service Broker and Integration Services can work so well together that they allow the creation of high-performance Real Time Data Integration solutions with just a few days of work. No matter if you're on premises or on Azure, real-time integration will open up new opportunities to deliver data and information faster and more efficiently, empowering end users with everything they need to do a great job. Let's say that your ERP software is on premises and you need to create a real-time dashboard in the cloud…or that you have to integrate with your cloud-based sales force management solution. Do you really think that a batch update every 15 minutes can be the solution, when for the same price you can have something done in real time? In this session we'll see how to build such a solution (one that allowed one of our customers to completely replace TIBCO), from start to end.
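
      If you've never touched Service Broker, this is roughly the kind of plumbing such a solution starts from (a bare-bones sketch with made-up names, not the actual demo code):

      -- minimal Service Broker plumbing: a message type, a contract,
      -- a queue and a service that the ERP side can send change events to
      CREATE MESSAGE TYPE [//demo/ErpChange] VALIDATION = WELL_FORMED_XML;

      CREATE CONTRACT [//demo/ErpChangeContract]
          ([//demo/ErpChange] SENT BY INITIATOR);

      CREATE QUEUE dbo.ErpChangeQueue;

      CREATE SERVICE [//demo/ErpChangeService]
          ON QUEUE dbo.ErpChangeQueue ([//demo/ErpChangeContract]);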

      See you there!

    • Expired Account Password on an Azure VM

      Today I faced a really nasty problem. I’m really falling in love with Azure, and especially with SQL Server hosted in an Azure VM. It opens up a huge number of opportunities for small, medium and big companies, since they can have everything they ask for without the burden of having to maintain their own servers.

      That’s very cool, but the inability to physically log into the server can give you some headaches if RDP doesn’t work as expected; for example, when you’re not in a domain and your password expires. It seems that no one at Microsoft has cared to fix the problem, since it’s still there even though people reported it back in 2013:

      http://www.flexecom.com/unable-to-change-password-logging-into-an-azure-hosted-virtual-server/

      Today I had exactly the same problem. At some point the RDP client started returning the error

      “The Local Security Authority Cannot be Contacted”

      After spending some time trying to find the cause of the error (even following some wrong roads, given that the error is just too generic), I thought it could be due to the password having expired. And that was exactly the problem. This post (even older than 2013, so the problem is older still…) http://blog.mnewton.com/articles/Solution-RDP-The-Local-Security-Authority-cannot-be-contacted/ confirmed that my idea could be correct.

      Unfortunately the aforementioned posts state the problem, but don’t really describe how to solve it in my specific case. The main problem is that if the server requires Network Level Authentication, the RDP client won’t show you the “Password Expired” screen, so you won’t be able to change the password. This means that you cannot access your VM anymore, which is not fair. By default NLA is enabled on Windows Server 2012 R2, and since I couldn’t log in, I couldn’t even disable it, so I was stuck with my problem.

      Anyway, at least now I knew where to look. Still, I had to solve another problem: how do I change the password of an Azure VM to which I cannot connect using RDP? Luckily it seems that a lot of people forget their passwords and need to reset them, so the problem is well known. Here are two posts that explain how to do it using PowerShell and the related Azure PowerShell module:

      http://serverfault.com/questions/446699/how-to-reset-the-admin-password-on-vm-on-windows-azure

      http://blogs.technet.com/b/keithmayer/archive/2014/06/26/microsoft-azure-virtual-machines-reset-forgotten-admin-password-with-windows-powershell.aspx

      The PowerShell script works if and only if the VM Agent is installed. Luckily this is the default option when you provision a new Azure VM, so you don’t have to do anything special to get it installed.

      http://azure.microsoft.com/blog/2014/04/11/vm-agent-and-extensions-part-1/

      Well, now you know it, keep it in mind in case you find yourself in the same situation.

    • Using NLog With BIDS Helper to add logging to BIML Script executions

      When using BIML within BIDS Helper, if your BIML Script files get complex it may be quite hard to debug BIML Script Execution in order to understand what’s going on behind the scenes.

      On Windows Server 2008 it was possible to use System.Diagnostics.Debug.WriteLine and DebugView from Sysinternals to do the trick, but it seems that this approach doesn’t work anymore on Windows Server 2012. Or, at least, I wasn’t able to make it work. In addition to that, I also wanted to have everything logged to a file, so that one doesn’t have to use and configure DebugView to do the job. DebugView is a great tool, but it’s not really suitable for junior developers.

      So I tried to use the fantastic NLog framework to create a “standard” way of logging from BIML. First of all, download it (just get the standard version, no need to grab the “Extended” one) and unpack it into a folder named “NLog-3.0” wherever you prefer.

      Now in your SSIS Solution create a BIML file named “BimlLogger.biml” and copy and paste the following code:

      <#@ template language="C#" #>
      <#@ assembly name="C:\Work\Lab\BIML-Debug\NLog\NLog-3.0\net40\NLog.dll" #>
      <#@ import namespace="System.Diagnostics" #>
      <#@ import namespace="NLog" #>
      <#@ import namespace="NLog.Config" #>
      <#@ import namespace="NLog.Targets" #>

      <#
          Logger logger = LogManager.GetLogger("BIML");

          LoggingConfiguration config = new LoggingConfiguration();

          FileTarget fileTarget = new FileTarget();
          config.AddTarget("file", fileTarget);

          fileTarget.FileName = @"${nlogdir}\..\..\biml.log";
          fileTarget.Layout = "${longdate}|${level:uppercase=true}|${message}";
          fileTarget.DeleteOldFileOnStartup = true;

          LoggingRule loggingRule = new LoggingRule("*", LogLevel.Trace, fileTarget);
          config.LoggingRules.Add(loggingRule);

          LogManager.Configuration = config;   
      #>

      Change the second line (the “assembly name” line) to reflect the path where you unpacked NLog, and then you can start to log anything you think can help you, just by referencing this file from any other BIML file and using the “logger” object to write to the log file. Here’s an example of a BIML script file that creates a test package:

      <#@ template language="C#" #>
      <#@ include file="BimlLogger.biml" #>

      <#
          logger.Info("Generating Package...");
      #>

      <Biml xmlns="http://schemas.varigence.com/biml.xsd">
        <Packages>
          <Package Name="Test" ConstraintMode="Linear" ProtectionLevel="EncryptSensitiveWithUserKey" />
        </Packages>
      </Biml>

      <#
          logger.Info("Done.");
      #>

      The log file will be created in the “NLog-3.0” folder you created before. Of course you can change this and many other options, since NLog is really flexible and powerful. Documentation and tutorials are here: https://github.com/nlog/nlog/wiki

      2014-12-13 Update

      For the trick to work, you have to be sure that the NLog assembly is *not* blocked…which is something that will happen automatically if you download the zip from the internet. To unblock the assembly, you have to right-click on it and then select “Unblock”:

      [screenshot: unblocking the NLog assembly]

    • PASS Summit 2014 Pre-Con Preview: Davide Mauri

      If you’re into Data Warehousing, you may be interested in attending the workshop I’ll deliver at PASS Summit 2014 in Seattle on November 4th.

      http://www.sqlpass.org/summit/2014/Sessions/PreConferenceSessions.aspx

      The workshop is entirely dedicated to explaining why and how a *successful* Data Warehouse can be thought out, designed, architected, built, loaded and tested using the Agile approach that, so far, has mainly been applied to the application development field and in the last year has gained traction (and finally, I would say) in the BI field as well. Both Gartner and Forrester also underline that Agile is a key factor for success in the modern BI world, since it has been verified that 50% of the requirements change in the first year of a BI project.

      If you want to read more about the workshop, you can read the Q&A just published here:

      http://www.sqlpass.org/Community/PASSBlog/tabid/1476/entryid/676/PASS-Summit-2014-Pre-Con-Preview-Davide-Mauri.aspx

      In addition to that, I’d also like to share the agenda of the workshop, which will give you even more information on what we’ll discuss on that day:

      • Why a Data Warehouse?
      • The Agile Approach
      • Modeling the Data Warehouse
        • Kimball, Inmon & Data Vault
        • Dimensional Modeling
        • Dimension, Fact, Measures
        • Star & Snowflake Schema
        • Transactional, Snapshot and Temporal Fact Tables
        • Slowly Changing Dimensions
      • Engineering the Solution
        • Building the Data Warehouse
          • Solution Architecture
          • Naming conventions, mandatory columns and other rules
          • Views and Stored Procedure usage
        • Loading the Data Warehouse
          • ETL Patterns
          • Best Practices
        • Automating Extraction and Loading
          • Making the solution automatable
          • BIML
      • Unit Testing Data
      • The Complete Picture
        • Where does Big Data come into play?
      • After the Data Warehouse
        • Optimized Hardware & Software
      • Conclusions

      As you can see it will be a fully packed day...so bring two cups of coffee and you'll be good :)

      See you in Seattle!

    • Sketch notes from 24 Hours of PASS

      24 Hours of PASS has passed and, besides the slides, demos and videos (which will be available soon at http://www.sqlpass.org/24hours/2014/summitpreview/Schedule.aspx), this time, thanks to Matt Penny (@salisbury_matt), you can also have very nice and well-done sketch notes that summarize the concepts of the sessions Matt attended, in a very nice, quick, effective and friendly way. Here’s what Matt did for my session:

      [Matt’s sketch notes for my session]

      I love it! I must say I’m a fan of sketch notes. It’s quite an art of its own, IMHO, ’cause good sketch notes mix written and visual language in such a way that it’s much easier for the reader to get the message and memorize it. I’ll be using the notes that Matt took for my session quite a lot in the future, for sure.

      Besides the notes for my session, you can find here

      http://mattypenny.net/2014/09/10/sketchnotes-from-24-hours-of-pass/

      sketch notes for the following sessions:

      • Brent Ozar on ‘Developers: Who Needs a DBA?’
      • Brian Knight on ‘Performance Tuning SQL Server Integration Services (SSIS)’
      • Allan Hirt on ‘Availability Groups vs. Failover Cluster Instances: What’s the Difference?’
      • Erin Stellato, Jonathan Kehayias on ‘Everything You Never Wanted to Know about Extended Events’
      • Gail Shaw on ‘Guessing Games: Statistics, Heuristics, and Row Estimations’
      • Tim Chapman, Denzil Ribeiro on ‘Troubleshoot Customer Performance Problems Like a Microsoft Engineer’
      • Argenis Fernandez on ‘Secure Your SQL Server Instance without Changing Any Code’
      • Joe Webb on ‘Hiring the Right People: Interviewing and Selecting the Right Team’
      • Robert Cain, Bradley Ball, Jason Strate on ‘Zero to Hero with PowerShell and SQL Server'
      • Chris Shaw, John Morehouse on ‘Real World SQL 2014 Migration Path Decisions’
      • Julie Koesmarno on ‘”I Want It NOW!” Data Visualization with Power View’
      • Jen Stirrup on ‘Business Intelligence Toolkit Overview: Microsoft Power BI and R’
      • Ryan Adams on ‘SQL Server AlwaysOn Quickstart’

      Thanks Matt!

    • SSIS Dashboard v 0.6.1

      Yesterday I released the latest SSIS Dashboard update. There are quite a lot of new features included that I find very useful when you have a server full of packages and logs. Here’s the complete list:

      • Updated Morris.js to v 0.5.1
      • Updated MetisMenu to v 1.1.1
      • Added information on "Child" Packages
      • Added more detail to the "Package Execution History" page. Also added an estimated end time / elapsed time for running packages, using a moving average of 7 steps (see the sketch after this list).
      • Added navigation sidebar in the main page that shows available folders and projects
      • Added support for folders and project filtering
      • Changed configuration file in order to comply with Python/Flask standards
      • Cleaned Up code in order to follow Python best practices (still a lot to do :))
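
      By the way, the 7-step moving average mentioned above can also be computed straight from SSISDB with T-SQL; here’s a rough sketch of the idea (not the dashboard’s actual query; the package name is just a placeholder):

      -- average duration (in seconds) of the last 7 successful executions of a package,
      -- used as a rough estimate for the execution currently running
      SELECT AVG(d.duration_sec) AS estimated_duration_sec
      FROM (
          SELECT TOP (7)
              DATEDIFF(SECOND, e.start_time, e.end_time) AS duration_sec
          FROM catalog.executions AS e
          WHERE e.package_name = N'Test.dtsx'   -- placeholder package name
            AND e.status = 7                    -- 7 = succeeded
          ORDER BY e.execution_id DESC
      ) AS d;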

      Have you had a chance to give it a try? What features would you like to see added?

      My plans for the next releases are to:

      • Add a configuration page so that you can choose the maximum number of rows returned (now set to 15) and the time interval you want to analyze (now set to 72 hours in the config file)
      • Use an EWMA instead of the simple Moving Average
      • Do a video to show how to install and use the Dashboard
      • Package everything into just one executable file / directory / VM (I want to be able to offer an “all-included” xcopy deployment…not only the .py files)
      • Include additional information taken from [event_message_context], [executable_statistics], [execution_parameters_values]
      • Fix the layout for small / medium screens (smartphones / tablet)
      • Add historical / average elapsed time also for Child Packages and Executables
      • Include DataFlow information

      Once all those things are done, version 1.0 will be ready.

      If you want to help, fork the code from Github:

      https://github.com/yorek/ssis-dashboard

      If you want to try it, go here:

      http://ssis-dashboard.azurewebsites.net/
