2tone_tony

Reputation: 47

How to bulk copy multiple CSV files from blob to multiple SQL database tables with Data Factory

I am trying to copy different CSV files in blob storage into their very own SQL tables (I want to auto-create these tables). I've seen a lot of questions, but I haven't seen any that answer this.

Currently I have a Get Metadata activity that grabs a list of child items to get the file names, and a ForEach loop, but from there I don't know how to send each file to its own table.

Upvotes: 1

Views: 3002

Answers (1)

Joseph Xu

Reputation: 6083

Updated: when I run the pipeline a second time, it appends new rows to the existing tables.
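If you want re-runs to replace the data instead of appending, the Copy activity sink also has a Pre-copy script field. A minimal sketch of that sink configuration, assuming the `dbo` schema and the table-name expression from step 4 below (Data Factory interpolates the `@{...}` block before the script reaches SQL):

```json
"sink": {
    "type": "AzureSqlSink",
    "tableOption": "autoCreate",
    "preCopyScript": {
        "value": "IF OBJECT_ID('dbo.@{replace(item().name,'.csv','')}', 'U') IS NOT NULL TRUNCATE TABLE dbo.[@{replace(item().name,'.csv','')}]",
        "type": "Expression"
    }
}
```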


I created a simple test and it works well. This is my test CSV file stored in Azure Data Lake.

Then we can use a pipeline to copy the CSV files into Azure SQL tables, auto-creating the tables as we go.

1. At the Get Metadata1 activity, set the dataset to the folder containing the CSV files, add Child Items to the activity's Field list, and select First row as header on the dataset.
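For reference, the activity portion of the pipeline JSON looks roughly like the sketch below; `CsvFolder` is a hypothetical name for a DelimitedText dataset pointing at the folder:

```json
{
    "name": "Get Metadata1",
    "type": "GetMetadata",
    "typeProperties": {
        "dataset": {
            "referenceName": "CsvFolder",
            "type": "DatasetReference"
        },
        "fieldList": [ "childItems" ]
    }
}
```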

2. At the ForEach1 activity, iterate over the file list by putting the expression @activity('Get Metadata1').output.childItems in the Items field.
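In the pipeline JSON this corresponds to something like the following sketch; the Copy data1 activity from the next step nests inside the `activities` array:

```json
{
    "name": "ForEach1",
    "type": "ForEach",
    "dependsOn": [
        {
            "activity": "Get Metadata1",
            "dependencyConditions": [ "Succeeded" ]
        }
    ],
    "typeProperties": {
        "items": {
            "value": "@activity('Get Metadata1').output.childItems",
            "type": "Expression"
        },
        "activities": []
    }
}
```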

3. Inside the ForEach1 activity, add a Copy data1 activity with the same data source as the Get Metadata1 activity. At the source tab, pass the dynamic content @item().name as the file name, so each iteration reads one file.
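One way to wire up the per-file source is a dataset parameter. A sketch, assuming an ADLS Gen2 location; `CsvFile`, `AzureDataLakeLS`, and the `data` file system are hypothetical names:

```json
{
    "name": "CsvFile",
    "properties": {
        "type": "DelimitedText",
        "linkedServiceName": {
            "referenceName": "AzureDataLakeLS",
            "type": "LinkedServiceReference"
        },
        "parameters": {
            "fileName": { "type": "string" }
        },
        "typeProperties": {
            "location": {
                "type": "AzureBlobFSLocation",
                "fileName": {
                    "value": "@dataset().fileName",
                    "type": "Expression"
                },
                "fileSystem": "data"
            },
            "columnDelimiter": ",",
            "firstRowAsHeader": true
        }
    }
}
```

The Copy activity's inputs then pass the expression as the parameter value: `"parameters": { "fileName": "@item().name" }`.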

4. At the sink tab, select Auto create table. In the Azure SQL dataset, type in the schema name and the dynamic content @replace(item().name,'.csv','') as the table name, because this information is needed to create each table dynamically.
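In JSON, a sketch of the parameterized Azure SQL dataset, assuming the `dbo` schema and hypothetical names `SqlTable` and `AzureSqlDbLS`:

```json
{
    "name": "SqlTable",
    "properties": {
        "type": "AzureSqlTable",
        "linkedServiceName": {
            "referenceName": "AzureSqlDbLS",
            "type": "LinkedServiceReference"
        },
        "parameters": {
            "tableName": { "type": "string" }
        },
        "typeProperties": {
            "schema": "dbo",
            "table": {
                "value": "@dataset().tableName",
                "type": "Expression"
            }
        }
    }
}
```

Inside the Copy activity, the Auto create table option maps to `"sink": { "type": "AzureSqlSink", "tableOption": "autoCreate" }`, and the outputs reference `SqlTable` with `"parameters": { "tableName": "@replace(item().name,'.csv','')" }`.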

5. The debug run succeeds: each CSV file is copied into its own auto-created table.

Upvotes: 5
