Reputation: 13
I am developing an SSIS package with a Data Flow Task that reads data from a SQL query and inserts it into a SQL Server table. The attributes below come from user-defined variables, which in turn get their values from a SQL Server table, and the Data Flow Task sits inside a Foreach Loop so that it runs once for each record in that table.
The metadata is not being refreshed for the new connections/queries/destination tables. Is there a way to handle this automatically at run time?
Upvotes: 0
Views: 2744
Reputation: 31785
No. If the source queries and destination tables don't all share the exact same metadata (column structure and data types), there is no way to use a loop with a single data flow that dynamically changes to handle all of the tables.
The column mapping of a data flow must be specified at design time and cannot be changed at run time.
The closest thing to what you want is Biml, which lets you dynamically generate packages based on the metadata you wish to import.
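As a rough illustration only, a minimal BimlScript sketch of that idea could look like the one below. It generates one package per source query/destination table pair; the connection names (`Src`, `Dst`), the connection strings, and the hard-coded list of queries and tables are placeholders for your environment (in practice you would read that list from the same driver table your Foreach Loop uses), not something from your existing package.

```xml
<#@ template language="C#" #>
<Biml xmlns="http://schemas.varigence.com/biml.xsd">
  <Connections>
    <!-- Placeholder connections; point these at your real source and target databases -->
    <OleDbConnection Name="Src" ConnectionString="Provider=SQLNCLI11;Data Source=.;Initial Catalog=SourceDb;Integrated Security=SSPI;" />
    <OleDbConnection Name="Dst" ConnectionString="Provider=SQLNCLI11;Data Source=.;Initial Catalog=TargetDb;Integrated Security=SSPI;" />
  </Connections>
  <Packages>
    <# // One (query, destination) pair per generated package.
       // Hypothetical sample data; normally this comes from your metadata/driver table.
       var loads = new[] {
         new { Query = "SELECT Id, Name FROM dbo.Customers", Dest = "dbo.Customers" },
         new { Query = "SELECT Id, Total FROM dbo.Orders",   Dest = "dbo.Orders" }
       };
       foreach (var load in loads) { #>
    <Package Name="Load <#= load.Dest #>" ConstraintMode="Linear">
      <Tasks>
        <Dataflow Name="DFT Load">
          <Transformations>
            <OleDbSource Name="Source" ConnectionName="Src">
              <DirectInput><#= load.Query #></DirectInput>
            </OleDbSource>
            <OleDbDestination Name="Destination" ConnectionName="Dst">
              <ExternalTableOutput Table="<#= load.Dest #>" />
            </OleDbDestination>
          </Transformations>
        </Dataflow>
      </Tasks>
    </Package>
    <# } #>
  </Packages>
</Biml>
```

Each generated package gets its column mappings fixed at generation time, which is what sidesteps the design-time metadata restriction: instead of one dynamic data flow, you end up with many static ones.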
Upvotes: 0