Reputation: 23753
Is there an API or another way to programmatically run a Databricks job? Ideally, we would like to trigger a Databricks job from a notebook. The following only returns the run ID of the currently running job, which isn't very useful:
dbutils.notebook.entry_point.getDbutils().notebook().getContext().currentRunId().toString()
Upvotes: 1
Views: 3353
Reputation: 6104
To run a Databricks job programmatically, you can use the Jobs API. I have a Databricks job called for_repro, which I ran from a Databricks notebook in the two ways shown below.
Using the requests library:
First generate a personal access token: go to Settings -> User settings and, under the Access tokens tab, click Generate new token. Then trigger the job with the run-now endpoint:
import requests
my_json = {"job_id": <your_job-id>}
auth = {"Authorization": "Bearer <your_access-token>"}
response = requests.post('https://<databricks-instance>/api/2.0/jobs/run-now', json=my_json, headers=auth).json()
print(response)
The <databricks-instance> value in the code above is the hostname portion of your workspace URL.
Using %sh magic command script:
%sh
curl --request POST --header "Authorization: Bearer <access_token>" \
https://<databricks-instance>/api/2.0/jobs/run-now \
--data '{"job_id": <your job id>}'
Refer to this Microsoft documentation for all other operations available through the Jobs API.
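The run-now response includes a run_id, which you can pass to the runs/get endpoint to wait for the run to finish. A minimal sketch, assuming the same <databricks-instance> and access token as above (the function names here are my own, not part of any library):

```python
import time
import requests

# Pure helper: build the URL and payload for the run-now call.
# "instance" is the <databricks-instance> hostname from your workspace URL.
def run_now_request(instance, job_id):
    return f"https://{instance}/api/2.0/jobs/run-now", {"job_id": job_id}

# Trigger the job, then poll runs/get until the run reaches a terminal
# life_cycle_state. The poll interval is an arbitrary choice.
def run_and_wait(instance, token, job_id, poll_seconds=30):
    headers = {"Authorization": f"Bearer {token}"}
    url, payload = run_now_request(instance, job_id)
    run_id = requests.post(url, json=payload, headers=headers).json()["run_id"]
    while True:
        run = requests.get(
            f"https://{instance}/api/2.0/jobs/runs/get",
            headers=headers,
            params={"run_id": run_id},
        ).json()
        state = run["state"]
        if state["life_cycle_state"] in ("TERMINATED", "SKIPPED", "INTERNAL_ERROR"):
            return state  # includes result_state such as SUCCESS or FAILED
        time.sleep(poll_seconds)
```

This way the calling notebook blocks until the job finishes instead of returning immediately after run-now.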
Upvotes: 3