Reputation: 37
I have a Databricks notebook with a SQL statement, something like below.
I need this notebook to be run by a Power Automate flow and to return the notebook output.
I followed the instructions mentioned here; it works fine and the job runs in Databricks. However, I am not able to fetch the notebook output back into the flow. Do I have to write the output somewhere in the Databricks notebook to fetch it?
Any help or links would be appreciated.
Upvotes: 0
Views: 1349
Reputation: 11529
AFAIK, we cannot return values from a Databricks notebook using SQL code. To do that, we need to use dbutils.notebook.exit() in the notebook.
So, change the last cell of your SQL notebook to a Python cell and call dbutils.notebook.exit() there.
As a sample, I have returned the output like below.
%python
# Run the SQL, collect the rows, and return them as a string via notebook exit
result = spark.sql("select split('Rakesh Govindula Laddu Virat',' ') as value, 'Hi laddu' as Greetings").collect()
dbutils.notebook.exit(str(result))
You can return anything as a string as per your requirement. Now, trigger the job using an HTTP POST action in the logic app. But this action only triggers the notebook execution; it won't get the output from it.
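For reference, that trigger call corresponds to the Jobs API 2.1 run-now endpoint. A minimal Python sketch of the same POST (the host, token, and job id below are placeholders you would replace with your own values):

import requests

# Placeholder workspace URL and personal access token; substitute your own
DATABRICKS_HOST = "https://adb-Xxxxxxxxxxxxxxxxx.azuredatabricks.net"
TOKEN = "<personal-access-token>"

# Trigger the job; the response body carries the run_id we need later
resp = requests.post(
    f"{DATABRICKS_HOST}/api/2.1/jobs/run-now",
    headers={"Authorization": f"Bearer {TOKEN}"},
    json={"job_id": 123},  # placeholder job id
)
run_id = resp.json()["run_id"]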
For the output of the notebook, we need to call the below API (GET) with the run id of the executed job in the logic app.
https://adb-Xxxxxxxxxxxxxxxxx.azuredatabricks.net/api/2.1/jobs/runs/get-output?run_id=<job run id>
After the HTTP POST action in the logic app, use another HTTP action with the GET method and provide the run_id that we got from the HTTP POST action's response body:
https://adb-Xxxxxxxxxxxxxxxxx.azuredatabricks.net/api/2.1/jobs/runs/get-output?run_id=@{body('HTTP')?['run_id']}
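That logic app GET is equivalent to the following Python sketch (reusing the placeholder DATABRICKS_HOST, TOKEN, and run_id from the sketch above). The string passed to dbutils.notebook.exit() comes back under notebook_output.result:

import requests

resp = requests.get(
    f"{DATABRICKS_HOST}/api/2.1/jobs/runs/get-output",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"run_id": run_id},
)
# notebook_output.result holds the value returned by dbutils.notebook.exit()
print(resp.json()["notebook_output"]["result"])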
Here, I have used a Delay action to wait till the job completes, and set it to 1 minute as per my job run's execution time. Change the time as per your job's execution time, or you can use the Do Until loop method as mentioned in this blog by @sharepointed.
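The Do Until approach amounts to polling the run state until it reaches a terminal state before calling get-output. A rough Python equivalent (same placeholder host, token, and run_id as above):

import time
import requests

# Poll until the run reaches a terminal lifecycle state
while True:
    state = requests.get(
        f"{DATABRICKS_HOST}/api/2.1/jobs/runs/get",
        headers={"Authorization": f"Bearer {TOKEN}"},
        params={"run_id": run_id},
    ).json()["state"]["life_cycle_state"]
    if state in ("TERMINATED", "SKIPPED", "INTERNAL_ERROR"):
        break  # run finished; now call get-output as shown above
    time.sleep(10)  # adjust the polling interval to your job's run time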
Execution and Output: (screenshots of the successful logic app run and the returned notebook output)
Upvotes: 1