Reputation: 163
I'm starting to migrate some on-prem operational data to Azure Blob Storage using Azure Data Factory, and I have a couple of complicated (well, I think they are) pieces to the puzzle.
I have 8 databases, each of which is identical (our operational system runs across 8 different sites, and each site has its own database) but contains data pertaining to that site only.
I need to consolidate the data from these sites. For example, for a table called People, when I report on People I need ONE table containing all people from every site. I achieve this in our current on-prem data warehouse with UNION ALL SQL queries, but with over 60 of them they are a pain in the rear to maintain, as they change each year with how our business operates.
My mind is boggling over the best/simplest way to achieve this, and I would appreciate a point in the right direction.
Upvotes: 1
Views: 176
Reputation: 6063
As @HimanshuSinha-msft mentioned, you can parameterize the linked service.
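A minimal sketch of such a parameterized linked service, assuming the name `LS_OnPremSql`, SQL authentication placeholders, and a self-hosted integration runtime for the on-prem servers (all of these are my assumptions, not from the original post):

```json
{
    "name": "LS_OnPremSql",
    "properties": {
        "type": "SqlServer",
        "parameters": {
            "serverName": { "type": "String" },
            "dbName": { "type": "String" }
        },
        "typeProperties": {
            "connectionString": "Integrated Security=False;Data Source=@{linkedService().serverName};Initial Catalog=@{linkedService().dbName};User ID=<user-name>;Password=<password>;"
        },
        "connectVia": {
            "referenceName": "<self-hosted-integration-runtime>",
            "type": "IntegrationRuntimeReference"
        }
    }
}
```

The `@{linkedService().serverName}` expressions are what let one linked service point at any of the 8 servers at run time.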
In the pipeline, we can define an array-type parameter servers and set its value to something like:
[{"server":"<server-name1>","db":"<dbname1>"},{"server":"<server-name2>","db":"<dbname2>"}, ...]
The shape of this array is just a structure I defined; each element holds the connection details for one site.
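Declared in the pipeline JSON, that parameter might look like this (a sketch; the default value is just the sample array above):

```json
"parameters": {
    "servers": {
        "type": "Array",
        "defaultValue": [
            { "server": "<server-name1>", "db": "<dbname1>" },
            { "server": "<server-name2>", "db": "<dbname2>" }
        ]
    }
}
```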
Then we can iterate over this array parameter in a ForEach activity via @pipeline().parameters.servers.
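A sketch of what the ForEach activity definition might look like (ForEachServer is a name I made up; the Copy activity from the next step goes inside the `activities` array):

```json
{
    "name": "ForEachServer",
    "type": "ForEach",
    "typeProperties": {
        "items": {
            "value": "@pipeline().parameters.servers",
            "type": "Expression"
        },
        "activities": []
    }
}
```

Inside the loop, each element of the array is available as @item(), so @item().server and @item().db resolve to one site's connection details per iteration.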
Inside the ForEach activity, we need to add two parameters to the source dataset of the Copy activity so the server and database names can be passed through.
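For example, the source dataset can declare serverName and dbName parameters and forward them to the linked service (a sketch reusing the hypothetical LS_OnPremSql above; the dataset name and table name are also assumptions):

```json
{
    "name": "DS_OnPremTable",
    "properties": {
        "type": "SqlServerTable",
        "linkedServiceName": {
            "referenceName": "LS_OnPremSql",
            "type": "LinkedServiceReference",
            "parameters": {
                "serverName": { "value": "@dataset().serverName", "type": "Expression" },
                "dbName": { "value": "@dataset().dbName", "type": "Expression" }
            }
        },
        "parameters": {
            "serverName": { "type": "String" },
            "dbName": { "type": "String" }
        },
        "typeProperties": {
            "tableName": "[dbo].[People]"
        }
    }
}
```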
Through these steps, a single Copy activity can copy data from multiple SQL Servers into one Azure SQL database, because the connection values are passed in dynamically at run time.
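Putting it together, the Copy activity inside the ForEach might wire @item() into the dataset parameters like this (a sketch; DS_OnPremTable is the dataset above and DS_AzureSqlPeople is an assumed sink dataset pointing at the consolidated Azure SQL table):

```json
{
    "name": "CopySiteData",
    "type": "Copy",
    "inputs": [
        {
            "referenceName": "DS_OnPremTable",
            "type": "DatasetReference",
            "parameters": {
                "serverName": { "value": "@item().server", "type": "Expression" },
                "dbName": { "value": "@item().db", "type": "Expression" }
            }
        }
    ],
    "outputs": [
        { "referenceName": "DS_AzureSqlPeople", "type": "DatasetReference" }
    ],
    "typeProperties": {
        "source": { "type": "SqlSource" },
        "sink": { "type": "AzureSqlSink" }
    }
}
```

If all 8 sites land in the same sink table, this effectively replaces the UNION ALL approach: one parameterized pipeline instead of 60-plus hand-maintained queries.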
Upvotes: 1