Reputation: 67
I have an Azure Data Factory pipeline that calls a Databricks notebook. I have parameterized the pipeline, and through it I pass the product name to the Databricks notebook.
Based on that parameter, the Databricks notebook pushes the processed data into a specific ADLS directory. The problem is: how do I make my pipeline aware of which parameter it needs to pass to Databricks?
Example: if I pass Nike via ADF to Databricks, the data should be pushed into the Nike directory; if I pass Adidas, the data should be pushed into the Adidas directory.
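For context, a minimal sketch of what the notebook side might look like, assuming a widget named product_name and hypothetical abfss:// paths (the storage account, containers, and processing step are placeholders, not taken from the question):

```python
# Databricks notebook (Python) - illustrative sketch only
product_name = dbutils.widgets.get("product_name")  # e.g. "Nike" or "Adidas"

# Hypothetical source and target locations; replace with your real container/paths
source_path = "abfss://raw@<storage-account>.dfs.core.windows.net/products/"
target_path = f"abfss://processed@<storage-account>.dfs.core.windows.net/{product_name}/"

df = spark.read.parquet(source_path)
processed = df.filter(df.product == product_name)  # whatever processing applies here
processed.write.mode("overwrite").parquet(target_path)
```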
Please note that I am triggering ADF from an Automation account.
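Since the pipeline is started from an Automation account, whichever runbook starts the run has to supply the parameter value. A hedged sketch using a Python runbook and the azure-mgmt-datafactory SDK follows (subscription, resource group, factory, and pipeline names are placeholders; a PowerShell runbook calling Invoke-AzDataFactoryV2Pipeline with -Parameter would do the same thing):

```python
# Python runbook sketch: start the ADF pipeline with an explicit product_name.
# All resource names below are placeholders.
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient

adf_client = DataFactoryManagementClient(
    credential=DefaultAzureCredential(),
    subscription_id="<subscription-id>",
)

run = adf_client.pipelines.create_run(
    resource_group_name="<resource-group>",
    factory_name="<data-factory>",
    pipeline_name="<pipeline-name>",
    parameters={"product_name": "Nike"},  # or "Adidas" - decided by the runbook
)
print(run.run_id)
```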
Upvotes: 0
Views: 745
Reputation: 31
As I understand it, you are using product_name = dbutils.widgets.get('product_name') in the Databricks notebook to read the parameter, and you process the data based on that parameter. The question then is how to pass different parameters to the notebook. You create one ADF pipeline and pass the different parameters through the triggers that execute it:
1. Create the ADF pipeline with a product_name parameter.
2. For that pipeline, create one trigger per product, each passing its own parameter value to the pipeline.
This way you have a single ADF pipeline with multiple trigger instances running it with different parameters such as Adidas, Nike, etc.
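A hedged sketch of that idea with the azure-mgmt-datafactory Python SDK (all names and the daily schedule are illustrative): register one schedule trigger per product, each passing its own product_name to the same pipeline.

```python
# Sketch: one pipeline, one scheduled trigger per product, each with its own parameter.
from datetime import datetime, timezone
from azure.identity import DefaultAzureCredential
from azure.mgmt.datafactory import DataFactoryManagementClient
from azure.mgmt.datafactory.models import (
    PipelineReference, ScheduleTrigger, ScheduleTriggerRecurrence,
    TriggerPipelineReference, TriggerResource,
)

adf_client = DataFactoryManagementClient(DefaultAzureCredential(), "<subscription-id>")

for product in ["Nike", "Adidas"]:
    trigger = ScheduleTrigger(
        recurrence=ScheduleTriggerRecurrence(
            frequency="Day", interval=1,
            start_time=datetime(2024, 1, 1, tzinfo=timezone.utc), time_zone="UTC",
        ),
        pipelines=[
            TriggerPipelineReference(
                pipeline_reference=PipelineReference(reference_name="<pipeline-name>"),
                parameters={"product_name": product},  # value the notebook widget receives
            )
        ],
    )
    adf_client.triggers.create_or_update(
        "<resource-group>", "<data-factory>", f"{product}Trigger",
        TriggerResource(properties=trigger),
    )
```

Remember that a trigger only fires after it has been started (triggers.begin_start in recent SDK versions, or Start trigger in the ADF UI).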
Upvotes: 0