Reputation: 47
I am trying to copy different csv files in blob storage into their very own sql tables (I want to auto create these tables). I've seen a lot of questions, but I haven't seen any that answer this.
Currently I have a Get Metadata activity that grabs a list of child items to get the names of the files, and a ForEach loop, but from there I don't know how to have them sent to different tables per file.
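For reference, the child items come back from the Get Metadata activity as a list of name/type pairs, roughly like this (the file names here are invented for illustration):

```json
{
    "childItems": [
        { "name": "orders.csv", "type": "File" },
        { "name": "customers.csv", "type": "File" }
    ]
}
```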
Upvotes: 1
Views: 3002
Reputation: 6083
Updated: when I run it a second time, it adds new rows into the table.
I created a simple test and it works well. This is my csv file stored in Azure Data Lake.
Then we can use a pipeline to copy these csv files into Azure SQL tables (auto creating the tables).
1. At the Get Metadata1 activity, we can set the dataset to the folder containing the csv files, and check First row as header at the dataset.
2. At the ForEach1 activity, we can iterate over the file list via the expression @activity('Get Metadata1').output.childItems.
3. Inside the ForEach1 activity, we can use a Copy data1 activity with the same data source as the Get Metadata1 activity. At the Source tab, we can type in the dynamic content @item().name to get the file name. At the Sink tab, we should select Auto create table. (A JSON sketch of the whole pipeline follows these steps.)
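Putting the steps together, a trimmed-down sketch of the pipeline JSON could look roughly like this. The dataset names (CsvFolder, CsvFileDynamic, AzureSqlTableDynamic) and the fileName/tableName parameters are placeholders I am assuming here, not names from the original pipeline:

```json
{
    "name": "CopyCsvsToTables",
    "properties": {
        "activities": [
            {
                "name": "Get Metadata1",
                "type": "GetMetadata",
                "typeProperties": {
                    "dataset": { "referenceName": "CsvFolder", "type": "DatasetReference" },
                    "fieldList": [ "childItems" ]
                }
            },
            {
                "name": "ForEach1",
                "type": "ForEach",
                "dependsOn": [
                    { "activity": "Get Metadata1", "dependencyConditions": [ "Succeeded" ] }
                ],
                "typeProperties": {
                    "items": {
                        "value": "@activity('Get Metadata1').output.childItems",
                        "type": "Expression"
                    },
                    "activities": [
                        {
                            "name": "Copy data1",
                            "type": "Copy",
                            "typeProperties": {
                                "source": { "type": "DelimitedTextSource" },
                                "sink": { "type": "AzureSqlSink", "tableOption": "autoCreate" }
                            },
                            "inputs": [
                                {
                                    "referenceName": "CsvFileDynamic",
                                    "type": "DatasetReference",
                                    "parameters": {
                                        "fileName": { "value": "@item().name", "type": "Expression" }
                                    }
                                }
                            ],
                            "outputs": [
                                {
                                    "referenceName": "AzureSqlTableDynamic",
                                    "type": "DatasetReference",
                                    "parameters": {
                                        "tableName": { "value": "@replace(item().name,'.csv','')", "type": "Expression" }
                                    }
                                }
                            ]
                        }
                    ]
                }
            }
        ]
    }
}
```

The sink's tableOption: "autoCreate" is what the Auto create table option sets under the hood.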
In the Azure SQL dataset, we should type in the schema name and the dynamic content @replace(item().name,'.csv','') as the table name, because this information is needed to create the table dynamically.
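A matching sketch of that parameterized Azure SQL dataset (again with assumed placeholder names, and dbo as an example schema) could look like this, so each Copy run can pass in a different table name:

```json
{
    "name": "AzureSqlTableDynamic",
    "properties": {
        "type": "AzureSqlTable",
        "linkedServiceName": { "referenceName": "AzureSqlDatabase1", "type": "LinkedServiceReference" },
        "parameters": {
            "tableName": { "type": "String" }
        },
        "typeProperties": {
            "schema": "dbo",
            "table": { "value": "@dataset().tableName", "type": "Expression" }
        }
    }
}
```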
Upvotes: 5