Reputation: 1
I want to run an Azure Databricks notebook from a Python file. I have a client_id, a secret, and a token id. I tried creating a Databricks client, but there doesn't seem to be a package that can run a Databricks notebook. TIA for any suggestions.
The answer should run a Databricks notebook the way this code runs a Data Factory pipeline:
import time
from azure.mgmt.datafactory import DataFactoryManagementClient

adf_client = DataFactoryManagementClient(credentials, subscription_id)
run_response = adf_client.pipelines.create_run(rg_name, df_name, df_pipeline_name, parameters=...............
pipeline_run = adf_client.pipeline_runs.get(rg_name, df_name, run_response.run_id)
status = pipeline_run.status
while status == 'Queued' or status == 'InProgress':
    time.sleep(5)
    status = adf_client.pipeline_runs.get(rg_name, df_name, run_response.run_id).status
Upvotes: 0
Views: 541
Reputation: 2662
You can use the Databricks REST APIs to trigger Databricks jobs. You first have to configure a job with a cluster and a notebook.
You can check this blog, which demonstrates this. The blog calls the APIs via Postman; you just have to replace that with Python code.
The official Databricks REST API documentation for triggering a job can be found here. The Databricks documentation also shows how to call the APIs from Python code.
You can use a Databricks token or an AAD bearer token for authorization.
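A minimal sketch of the trigger-and-poll pattern against the Databricks Jobs REST API (2.1), mirroring the ADF loop in the question. The workspace URL, token, and `JOB_ID` below are placeholders you must replace, and the job is assumed to already be configured with a cluster and the notebook:

```python
import time
import requests

DATABRICKS_HOST = "https://<your-workspace>.azuredatabricks.net"  # placeholder
DATABRICKS_TOKEN = "<databricks-pat-or-aad-bearer-token>"         # placeholder
JOB_ID = 123                                                      # placeholder

def auth_headers(token):
    # Both a Databricks personal access token and an AAD bearer token
    # are sent in the same Authorization header.
    return {"Authorization": f"Bearer {token}"}

def run_notebook_job(host, token, job_id, notebook_params=None):
    # POST /api/2.1/jobs/run-now starts the configured job and returns a run_id.
    resp = requests.post(
        f"{host}/api/2.1/jobs/run-now",
        headers=auth_headers(token),
        json={"job_id": job_id, "notebook_params": notebook_params or {}},
    )
    resp.raise_for_status()
    run_id = resp.json()["run_id"]

    # Poll GET /api/2.1/jobs/runs/get until the run is no longer in flight,
    # just like the 'Queued'/'InProgress' loop in the ADF snippet.
    while True:
        state = requests.get(
            f"{host}/api/2.1/jobs/runs/get",
            headers=auth_headers(token),
            params={"run_id": run_id},
        ).json()["state"]
        if state["life_cycle_state"] not in ("PENDING", "RUNNING", "TERMINATING"):
            return state  # includes result_state, e.g. SUCCESS or FAILED
        time.sleep(5)

if __name__ == "__main__":
    final_state = run_notebook_job(DATABRICKS_HOST, DATABRICKS_TOKEN, JOB_ID)
    print(final_state.get("result_state"))
```

You can pass `notebook_params` to forward values into the notebook, much like the `parameters=` argument in the ADF `create_run` call.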
Upvotes: 1