THE SQL Server Blog Spot on the Web


Andy Leonard

Andy Leonard is an author and engineer who enjoys building and automating data integration solutions. Andy is co-host of the Data Driven podcast. Andy is no longer updating this blog. His current blog is AndyLeonard.blog.

Intelligent Data Integration, SSIS Design Patterns, and Biml

This blog has moved! You can find this content at the following new location:

http://andyleonard.blog/2017/02/04/intelligent-data-integration-ssis-design-patterns-and-biml-2/

Published Saturday, February 4, 2017 11:19 AM by andyleonard


Comments

 

garry said:

Hi Andy,

I don't know if this is the right place to post this question, but I am trying to accomplish something that is not available out of the box.

I am trying to build a package where:

1) The user enters the SQL.

2) The source and destination databases are supplied through project parameters.

3) The package loads the data into the destination table based on the SQL supplied by the user.

4) The hard part is doing the mapping in the Script Component, which changes with the source SQL.

5) I am able to prepare the SQL and the script using the Script Component, but I am not able to do the mapping in the Script Component, because it changes with every SQL statement.

Do you think this can be accomplished using the Script Component?

I would appreciate any help.

Thanks,

Garry

March 28, 2017 12:43 PM
 

andyleonard said:

Hi Garry,

  "Pipelines" are created when an SSIS developer configures an adapter in a Data Flow Task. This happens when, for instance, you add an OLE DB Source adapter to a data flow and configure it to connect to a source database via an OLE DB Connection Manager. At that point, a data flow pipeline is created and configured to use the columns - by name and data type - specified in the OLE DB Source adapter.

  As far as I know (and I could be wrong), it isn't possible to decouple pipelines and re-couple them dynamically at runtime.

  This is why changes to source or destination tables break data flows.

  There are other data integration tools that support "semantic rationalization," which is exactly the problem you are trying to solve. Expressor sold their data integration tool, which managed changes to underlying schemata, to Qlik a few years ago. I'm not sure whether Qlik still sells it; I couldn't locate it on their site.
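  [Editor's note: the metadata-driven pattern Garry describes — discovering the column mapping at run time from the user's SQL instead of fixing it at design time, which the SSIS pipeline cannot do — can be sketched outside SSIS. Below is a minimal, hypothetical Python illustration using the standard-library sqlite3 module as a stand-in for the source and destination databases; the function and table names are invented for the example.]

```python
import sqlite3

def load_dynamic(conn, source_sql, dest_table):
    """Run user-supplied SQL and load the results into dest_table,
    building the column mapping at run time from cursor metadata
    (the step a fixed SSIS pipeline performs at design time)."""
    cur = conn.execute(source_sql)
    # Column names are discovered from the result set, not hard-coded.
    cols = [d[0] for d in cur.description]
    placeholders = ", ".join("?" * len(cols))
    insert = f"INSERT INTO {dest_table} ({', '.join(cols)}) VALUES ({placeholders})"
    rows = cur.fetchall()
    conn.executemany(insert, rows)
    return len(rows)

# Demo: the same loader handles a different column list with no redesign.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE src (id INTEGER, name TEXT, qty INTEGER)")
conn.executemany("INSERT INTO src VALUES (?, ?, ?)",
                 [(1, "a", 10), (2, "b", 20)])
conn.execute("CREATE TABLE dest (id INTEGER, name TEXT, qty INTEGER)")

n = load_dynamic(conn, "SELECT id, name FROM src", "dest")  # subset of columns
print(n)  # prints 2
```

Because the INSERT statement is generated from the result-set metadata on every run, a changed source query changes the mapping automatically; in SSIS, by contrast, that mapping is baked into the data flow's pipeline metadata when the package is designed.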

Hope this helps,

Andy

March 28, 2017 2:34 PM

