Reputation: 157
I am trying to create a copy activity between two Azure Data Lake Gen1 stores. I have to make the copy over a path where one of the subfolders is variable, for example:
rootFolder/subFolder1/*/subFolder3
where * can take different values, and the copy has to be made automatically for all of those possible values, so it is not practical to set that subfolder as a parameter and run the pipeline once for every possible value.
I would like to know if there is a way to implement this copy activity automatically. I am new to Azure and ADF.
Upvotes: 0
Views: 1304
Reputation: 1422
This can be achieved by using Wildcard filtering in the source settings of your Copy Activity.
Set Wildcard Folder path = rootFolder/subFolder1/*/subFolder3
Wildcard file name = *, *.json, *.txt, *.csv, etc., based on your requirement.
For Example:
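As a rough illustration, here is a minimal sketch of how the source and sink of such a Copy activity might look in the pipeline JSON, assuming a binary (file-to-file) copy between two ADLS Gen1 linked services; the activity and dataset names (CopyWithWildcardFolder, SourceAdlsGen1Binary, SinkAdlsGen1Binary) are placeholders, not something from the original question:

```json
{
    "name": "CopyWithWildcardFolder",
    "type": "Copy",
    "inputs": [ { "referenceName": "SourceAdlsGen1Binary", "type": "DatasetReference" } ],
    "outputs": [ { "referenceName": "SinkAdlsGen1Binary", "type": "DatasetReference" } ],
    "typeProperties": {
        "source": {
            "type": "BinarySource",
            "storeSettings": {
                "type": "AzureDataLakeStoreReadSettings",
                "recursive": true,
                "wildcardFolderPath": "rootFolder/subFolder1/*/subFolder3",
                "wildcardFileName": "*"
            }
        },
        "sink": {
            "type": "BinarySink",
            "storeSettings": {
                "type": "AzureDataLakeStoreWriteSettings"
            }
        }
    }
}
```

Here wildcardFolderPath handles the folder-level matching (every value of the * subfolder) and wildcardFileName filters the files inside each matched subFolder3, so a single pipeline run copies all matching folders.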
To learn more about the resulting behavior of the folder path and file name with wildcard filters, refer to this MS doc: https://learn.microsoft.com/azure/data-factory/connector-azure-data-lake-store#folder-and-file-filter-examples
Upvotes: 1