Borislav Blagoev
How can I get the job name while the job is running in Databricks? It is not a notebook-based job

I'm trying to get the name of the currently running job so I can send notifications. For example: I deploy my job to Databricks and run it, and when it starts I want it to post a message to Slack that includes the job name. That's why I need to get the name of the current job from within the job itself.

Upvotes: 4

Answers (1)

Alex Ott

Databricks exposes a lot of information via spark.conf; the relevant configuration properties start with spark.databricks.clusterUsageTags., so you can filter the configuration entries and search for the information you need.
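
A minimal sketch of that filtering step. On a real cluster you would iterate over spark.sparkContext.getConf().getAll(); the conf_pairs list and its values below are illustrative stand-ins so the logic is visible on its own:

```python
# Stand-in for spark.sparkContext.getConf().getAll() on a Databricks cluster;
# the keys and values here are made-up examples.
conf_pairs = [
    ("spark.app.name", "my-job"),
    ("spark.databricks.clusterUsageTags.clusterId", "0923-164208-example"),
    ("spark.databricks.clusterUsageTags.clusterName", "job-cluster"),
]

# Keep only the Databricks cluster-usage-tag properties.
prefix = "spark.databricks.clusterUsageTags."
usage_tags = {k: v for k, v in conf_pairs if k.startswith(prefix)}
```

Printing usage_tags on a live cluster is a quick way to discover which tags your workspace actually exposes.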

For jobs (and, slightly differently, for interactive clusters as well) there is a configuration property spark.databricks.clusterUsageTags.clusterAllTags that holds a JSON string: a list of dictionaries, each with key and value fields. For jobs, the RunName key holds the job name and JobId holds the job ID. You can convert it with something like this:

import json

# clusterAllTags is a JSON-encoded list of {"key": ..., "value": ...} dicts;
# flatten it into a plain key -> value mapping.
all_tags = {
    tag['key']: tag['value']
    for tag in json.loads(
        spark.conf.get("spark.databricks.clusterUsageTags.clusterAllTags"))
}
job_name = all_tags.get('RunName')
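
Once you have job_name, posting the startup message to Slack can be sketched with a standard incoming webhook. The helper names (build_slack_payload, notify_slack) and the webhook URL are illustrative assumptions, not part of any Databricks API:

```python
import json
import urllib.request

def build_slack_payload(job_name):
    # Slack incoming webhooks accept a JSON body with a "text" field.
    return {"text": f"Job '{job_name}' has started"}

def notify_slack(webhook_url, job_name):
    # POST the JSON payload to the Slack incoming-webhook URL.
    data = json.dumps(build_slack_payload(job_name)).encode("utf-8")
    req = urllib.request.Request(
        webhook_url,
        data=data,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return resp.status

# Usage (webhook URL comes from your Slack app configuration):
# notify_slack("https://hooks.slack.com/services/...", job_name)
```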

Upvotes: 6
