Reputation: 1258
I have created an Azure Data Factory pipeline to copy data from one ADLS container to another ADLS container using a Copy data activity. The copy activity is started by a storage event trigger, so whenever a new file is generated, the activity runs.
The source file is located in a nested directory structure with dynamic folders for year, month, and day, which vary based on the date. In the trigger, I specified the path up to the fixed folder, but I don't know what value to put for the dynamic part. Initially, I provided a path such as my/fixed/directory/*/*/*/, but at execution time it throws a 'PathNotFound' exception.
So my question is: how can I provide the path to the storage event trigger with a dynamic folder structure?
Following are screenshots of the ADF copy data pipeline:
Pipeline-
Copy data activity source configuration-
Copy data activity target configuration-
Copy data activity source dataset configuration-
Copy data activity target dataset configuration-
Upvotes: 1
Views: 2455
Reputation: 6124
You can only use blob path begins with or blob path ends with in storage event triggers; wildcards such as * are not accepted in the trigger path, which is why your wildcard pattern fails. In my setup, input/folder/2022 is the fixed directory (input is the container name), and I also have sub folders within each of the folders shown below.

Source dataset configuration:
folder path: @replace(dataset().folder_name,'input/','')
file name: @dataset().file_name

Sink dataset configuration, writing to the data container:
folder path: @concat('output/',replace(dataset().folder,'input/folder/',''))
file name: @dataset().file
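For reference, a parameterized ADLS Gen2 source dataset along these lines can be sketched in JSON roughly as follows (the dataset name, linked service name, and DelimitedText format are assumptions for illustration, not taken from the screenshots):

    {
        "name": "SourceDataset",
        "properties": {
            "type": "DelimitedText",
            "linkedServiceName": {
                "referenceName": "AdlsLinkedService",
                "type": "LinkedServiceReference"
            },
            "parameters": {
                "folder_name": { "type": "string" },
                "file_name": { "type": "string" }
            },
            "typeProperties": {
                "location": {
                    "type": "AzureBlobFSLocation",
                    "fileSystem": "input",
                    "folderPath": {
                        "value": "@replace(dataset().folder_name,'input/','')",
                        "type": "Expression"
                    },
                    "fileName": {
                        "value": "@dataset().file_name",
                        "type": "Expression"
                    }
                },
                "columnDelimiter": ",",
                "firstRowAsHeader": true
            }
        }
    }

The sink dataset is the same idea, with its own folder/file parameters and the sink container in fileSystem.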
The pipeline parameters folderName and fileName will be set while creating the trigger, as shown below:
fileName : @triggerBody().fileName
folderName : @triggerBody().folderPath
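Behind the trigger creation UI, the definition corresponds roughly to JSON like this (subscription, resource group, storage account, and pipeline names are placeholders; the exact property set may vary):

    {
        "name": "StorageEventTrigger",
        "properties": {
            "type": "BlobEventsTrigger",
            "typeProperties": {
                "blobPathBeginsWith": "/input/blobs/folder/2022",
                "ignoreEmptyBlobs": true,
                "scope": "/subscriptions/<subscription-id>/resourceGroups/<resource-group>/providers/Microsoft.Storage/storageAccounts/<storage-account>",
                "events": [ "Microsoft.Storage.BlobCreated" ]
            },
            "pipelines": [
                {
                    "pipelineReference": {
                        "referenceName": "CopyPipeline",
                        "type": "PipelineReference"
                    },
                    "parameters": {
                        "fileName": "@triggerBody().fileName",
                        "folderName": "@triggerBody().folderPath"
                    }
                }
            ]
        }
    }

Note that blobPathBeginsWith takes the /<container>/blobs/<path> form, with no wildcards.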
Now, whenever a file is uploaded into any sub folder of folder/2022, the pipeline will be triggered. To test this, I uploaded a file to folder/2022/03/01/sample1.csv, and it triggered the pipeline successfully.
So, creating a storage event trigger for just the parent directory is sufficient to be able to trigger the pipeline for any file uploaded to child directories as well.
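One piece not shown explicitly above is how the pipeline parameters reach the dataset parameters: in the copy activity, each dataset reference passes them through. A fragment from the copy activity definition might look like this (dataset names assumed as in the earlier sketch):

    "inputs": [
        {
            "referenceName": "SourceDataset",
            "type": "DatasetReference",
            "parameters": {
                "folder_name": "@pipeline().parameters.folderName",
                "file_name": "@pipeline().parameters.fileName"
            }
        }
    ],
    "outputs": [
        {
            "referenceName": "SinkDataset",
            "type": "DatasetReference",
            "parameters": {
                "folder": "@pipeline().parameters.folderName",
                "file": "@pipeline().parameters.fileName"
            }
        }
    ]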
Upvotes: 2