Reputation: 39
I have a folder structure as below in Azure Data Lake (blob storage):
(landing > app > delta > table1 > 2021 > 05 > ABC*.csv)
(landing > app > delta > table2 > 2021 > 05 > DEF*.csv)
Each month a new folder is created for every table, for example:
(landing > app > delta > table1 > 2021 > 06 > XYZ*.csv)
There are several files in each sub-folder. I want to copy these files to another location (archive), maintaining the same structure, and keep appending files to the archive folder. Whenever a new folder is created in the source, the same structure needs to be created under the archive folder and the files moved over as well. Once the files are moved from the source folder, they need to be deleted.
(landing > app > archive > table1 > 2021 > 05 > ABC*.csv)
(landing > app > archive > table2 > 2021 > 05 > DEF*.csv)
How can this archival process be achieved in Azure Data Factory? Please advise. Any help is appreciated.
Thanks.
Upvotes: 0
Views: 2846
Reputation: 8680
Please try this:
1. Create the source dataset and select the Binary format (only the Binary format has the deleteFilesAfterCompletion setting).
2. Create the sink dataset and select the Binary format as well.
3. Create a Copy Data activity, enable "Delete files after completion" and "Recursively" on the source, and set the sink copy behavior to "Preserve hierarchy", roughly as in the sketch after this list.
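For reference, here is a minimal sketch of the Binary source dataset, assuming a hypothetical linked service named DataLakeLS and the container/folder layout from the question; the sink dataset has the same shape, pointing at app/archive instead of app/delta:

```json
{
    "name": "SourceBinary",
    "properties": {
        "type": "Binary",
        "linkedServiceName": {
            "referenceName": "DataLakeLS",
            "type": "LinkedServiceReference"
        },
        "typeProperties": {
            "location": {
                "type": "AzureBlobFSLocation",
                "fileSystem": "landing",
                "folderPath": "app/delta"
            }
        }
    }
}
```

And a sketch of the Copy activity, assuming an ADLS Gen2 linked service (for plain Blob Storage the store settings types would be AzureBlobStorageReadSettings / AzureBlobStorageWriteSettings); the dataset names SourceBinary and ArchiveBinary are assumptions:

```json
{
    "name": "ArchiveDeltaFiles",
    "type": "Copy",
    "description": "Move files from app/delta to app/archive, preserving the folder structure",
    "inputs": [ { "referenceName": "SourceBinary", "type": "DatasetReference" } ],
    "outputs": [ { "referenceName": "ArchiveBinary", "type": "DatasetReference" } ],
    "typeProperties": {
        "source": {
            "type": "BinarySource",
            "storeSettings": {
                "type": "AzureBlobFSReadSettings",
                "recursive": true,
                "deleteFilesAfterCompletion": true
            },
            "formatSettings": { "type": "BinaryReadSettings" }
        },
        "sink": {
            "type": "BinarySink",
            "storeSettings": {
                "type": "AzureBlobFSWriteSettings",
                "copyBehavior": "PreserveHierarchy"
            }
        }
    }
}
```

With recursive set to true and PreserveHierarchy on the sink, the table/year/month sub-folders are recreated under app/archive, and deleteFilesAfterCompletion removes each source file once it has been copied, which gives the move-and-delete behavior you described.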
Finally, create a Blob created event trigger, attach it to the pipeline, and publish everything.
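A minimal sketch of that trigger, assuming the pipeline is named ArchivePipeline; the blobPathBeginsWith value is derived from the folder structure in the question, and the scope placeholder must be replaced with your storage account's resource ID:

```json
{
    "name": "DeltaBlobCreatedTrigger",
    "properties": {
        "type": "BlobEventsTrigger",
        "typeProperties": {
            "blobPathBeginsWith": "/landing/blobs/app/delta/",
            "ignoreEmptyBlobs": true,
            "events": [ "Microsoft.Storage.BlobCreated" ],
            "scope": "/subscriptions/<subscription-id>/resourceGroups/<rg>/providers/Microsoft.Storage/storageAccounts/<account>"
        },
        "pipelines": [
            {
                "pipelineReference": {
                    "referenceName": "ArchivePipeline",
                    "type": "PipelineReference"
                }
            }
        ]
    }
}
```

Note that blob event triggers rely on Azure Event Grid, so the Microsoft.EventGrid resource provider must be registered on the subscription for the trigger to fire.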
Upvotes: 1