Reputation: 2157
We have two files stored in Blob Storage: one is ^-delimited and the other is a comma-separated txt file, like below
File1 fields are like
ItemId^Name^c1^type^count^code^Avail^status^Ready
File2 Fields are like
ItemId,Num,c2
Here the first column in both files is the key, and based on it I need to insert them into one table in the Azure DB using Azure Data Factory. Can anyone suggest how this can be done in ADF? Should we merge the two files into one file before inserting into the database?
AzureDB columns are
ItemId Name c1 type count code Avail status Ready Num c2
So it should be like
Item1 ABC(S) 1234 Toy 10 N N/A POOL N/A 19 EM
Item2 DEF(S) 5678 toy 7 X N/A POOL N/A 6 MP
I was referring to this: Merging two or more files from a storage account based on a column using Azure Data Factory, but couldn't understand whether we can merge the two files before inserting into the DB.
Upvotes: 0
Views: 657
Reputation: 15215
You can use the 2 files to create 2 datasets, use a join transformation to join them together on ItemId, and simply sink to the SQL table in a data flow.
An inner join is used here; you can adapt it to whichever join type you prefer.
You can see in the data preview that the join successfully merged the 2 files/data sources.
Adjust the field mapping in Sink if needed.
Here is the arrow-separated.csv I used:
ItemId^Name^c1^type^count^code^Avail^status^Ready
Item1^ABC(S)^1234^Toy^10^N^N/A^POOL^N/A
Item2^DEF(S)^5678^toy^7^X^N/A^POOL^N/A
Here is the comma-separated.csv I used:
ItemId,Num,c2
Item1,19,EM
Item2,6,MP
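Outside ADF, the inner join the data flow performs can be sketched in plain Python (stdlib csv only; the file contents are inlined for illustration, and the function name is just for this sketch):

```python
import csv
import io

# Sample contents of the two blob files from the question.
ARROW_SEPARATED = """ItemId^Name^c1^type^count^code^Avail^status^Ready
Item1^ABC(S)^1234^Toy^10^N^N/A^POOL^N/A
Item2^DEF(S)^5678^toy^7^X^N/A^POOL^N/A
"""

COMMA_SEPARATED = """ItemId,Num,c2
Item1,19,EM
Item2,6,MP
"""

def inner_join_on_itemid(arrow_text, comma_text):
    """Inner-join the two files on ItemId, mirroring the data flow join."""
    left = list(csv.DictReader(io.StringIO(arrow_text), delimiter="^"))
    right = {row["ItemId"]: row for row in csv.DictReader(io.StringIO(comma_text))}
    merged = []
    for row in left:
        match = right.get(row["ItemId"])
        if match is not None:  # inner join: keep only keys present in both files
            merged.append({**row, "Num": match["Num"], "c2": match["c2"]})
    return merged

rows = inner_join_on_itemid(ARROW_SEPARATED, COMMA_SEPARATED)
for r in rows:
    print(r["ItemId"], r["Name"], r["Num"], r["c2"])
```

Each merged row carries all 11 target columns (ItemId Name c1 type count code Avail status Ready Num c2), which is what the Sink mapping writes to the Azure SQL table.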
Upvotes: 1