Reputation: 1603
In a Databricks notebook, I would like to know which file triggered the pipeline.
I have prepared an Azure Data Factory pipeline. It has a blob event-based trigger and it runs a Databricks notebook. In the Databricks notebook I would like to know which file triggered the pipeline (its path as well).
The trigger itself works; it fires when I upload a file.
I have prepared pipeline parameters of type string with these values:

SourceFile (string): @triggerBody().fileName
SourceFolder (string): @trigger().outputs.body.folderPath

In the task I have prepared parameters like:

SourceFile: @pipeline().parameters.SourceFile
SourceFolder: @pipeline().parameters.SourceFolder
In the Databricks notebook, this code:
dbutils.widgets.text("SourceFolder", "", "")
y = dbutils.widgets.get("SourceFolder")
print("Param - 'SourceFolder':")
print(y)
prints this:
Param -'SourceFolder': @trigger().outputs.body.folderPath
I would like to see the path or file name, not the expression text.
Upvotes: 0
Views: 995
Reputation: 11329
I tried the above in my environment and it works fine for me.
I created two parameters, foldername and filename.
I have created the trigger like below.
Assign the trigger parameters @triggerBody().folderPath and @triggerBody().fileName to the pipeline parameters like below.
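The important point is that these expressions must be set as trigger-time parameter values in the trigger definition, not as the pipeline parameters' default values (a default value is passed through as literal text, which would explain the output in the question). A minimal sketch of the trigger JSON, assuming hypothetical trigger and pipeline names:

```json
{
  "name": "BlobEventTrigger1",
  "properties": {
    "type": "BlobEventsTrigger",
    "pipelines": [
      {
        "pipelineReference": {
          "referenceName": "pipeline1",
          "type": "PipelineReference"
        },
        "parameters": {
          "foldername": "@triggerBody().folderPath",
          "filename": "@triggerBody().fileName"
        }
      }
    ]
  }
}
```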
Then I created a Set variable activity to build the file path with the dynamic content below.
@concat(pipeline().parameters.foldername,'/',pipeline().parameters.filename)
Pass this as Notebook base parameter.
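The @concat expression above is plain string concatenation; a quick Python equivalent, with hypothetical sample values standing in for the two trigger outputs, shows what the notebook should receive:

```python
# Hypothetical sample values standing in for the trigger outputs
# (in ADF these come from @triggerBody().folderPath / @triggerBody().fileName).
foldername = "container/input"
filename = "data.csv"

# Python equivalent of
# @concat(pipeline().parameters.foldername,'/',pipeline().parameters.filename)
filepath = foldername + "/" + filename
print(filepath)  # container/input/data.csv
```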
Notebook Execution:
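On the notebook side, the base parameter is read back through dbutils.widgets. Since dbutils only exists inside a Databricks runtime, this sketch (with an assumed parameter name, filepath) falls back to a sample value when run elsewhere:

```python
# "filepath" is an assumed base-parameter name; match it to the name
# used in the Databricks activity's base parameters in ADF.
try:
    dbutils.widgets.text("filepath", "", "")
    filepath = dbutils.widgets.get("filepath")
except NameError:
    # dbutils is only defined inside a Databricks notebook;
    # use a sample value so the sketch runs locally too.
    filepath = "container/input/data.csv"

print(filepath)
```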
Also, I tried with @trigger().outputs.body.fileName and @trigger().outputs.body.folderPath, and it worked fine for me.
First, try with a Set variable activity and check whether you are getting the file path from the trigger parameters.
If it still gives the same result, it is better to raise a support ticket for your issue.
Upvotes: 3