Reputation: 109
In Data Factory, I know you can pass a parameter at the beginning of a pipeline and then access it later using @pipeline(). If I have a folder in a Data Lake Store, how can I pass that as a parameter and access it later (let's say I want to loop a ForEach over each file inside it)? Do I pass the path to the folder? Am I passing it as an object?
Upvotes: 1
Views: 1203
Reputation: 2363
First, create a Data Lake Store linked service. Then create a Data Lake Store dataset that references that linked service.
Then create a Get Metadata activity that references the dataset.
From there, follow the steps provided by Summit.
All of this can be done in the UI: https://learn.microsoft.com/en-us/azure/data-factory/quickstart-create-data-factory-portal#create-a-pipeline
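The dataset step above can be sketched in pipeline JSON. This is a minimal sketch assuming an ADLS Gen1 dataset; the names (DataLakeFolderDataset, AzureDataLakeStoreLinkedService) are hypothetical, and the key idea is that the dataset itself takes a folderPath parameter so the pipeline can pass its folder in:

```json
{
  "name": "DataLakeFolderDataset",
  "properties": {
    "type": "AzureDataLakeStoreFile",
    "linkedServiceName": {
      "referenceName": "AzureDataLakeStoreLinkedService",
      "type": "LinkedServiceReference"
    },
    "parameters": {
      "folderPath": { "type": "String" }
    },
    "typeProperties": {
      "folderPath": "@dataset().folderPath"
    }
  }
}
```

With this in place, any activity that references the dataset can supply a value for folderPath, which is how the pipeline-level parameter reaches the data lake folder.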
Upvotes: 1
Reputation: 300
Here are the steps that you can use:
1. Pass the folder path as a parameter (string) to the pipeline.
2. Use that path in a "Get Metadata" activity with the "Child Items" field. This returns the list of files in JSON format.
3. Loop through them with a "ForEach" activity and perform any action on each file.
Use the childItems array from the Get Metadata activity's output as the Items setting of the ForEach activity (example below):
@activity('Get List of Files').output.childItems
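Putting the steps above together, a minimal pipeline JSON sketch might look like the following. The pipeline and dataset names are hypothetical, and the inner Wait activity is just a placeholder for whatever per-file work you need (inside the ForEach, @item().name gives the current file's name):

```json
{
  "name": "LoopOverFilesPipeline",
  "properties": {
    "parameters": {
      "folderPath": { "type": "string" }
    },
    "activities": [
      {
        "name": "Get List of Files",
        "type": "GetMetadata",
        "typeProperties": {
          "dataset": {
            "referenceName": "DataLakeFolderDataset",
            "type": "DatasetReference",
            "parameters": {
              "folderPath": "@pipeline().parameters.folderPath"
            }
          },
          "fieldList": [ "childItems" ]
        }
      },
      {
        "name": "ForEachFile",
        "type": "ForEach",
        "dependsOn": [
          {
            "activity": "Get List of Files",
            "dependencyConditions": [ "Succeeded" ]
          }
        ],
        "typeProperties": {
          "items": {
            "value": "@activity('Get List of Files').output.childItems",
            "type": "Expression"
          },
          "activities": [
            {
              "name": "ProcessFile",
              "type": "Wait",
              "typeProperties": { "waitTimeInSeconds": 1 }
            }
          ]
        }
      }
    ]
  }
}
```

Note that Items must be an array, which is why the expression drills into .output.childItems rather than stopping at .output (an object).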
Hope this helps
Upvotes: 1