Yash

Reputation: 530

Access databricks secrets in pyspark/python job

Databricks secrets can be accessed within notebooks using dbutils. However, since dbutils is not available outside notebooks, how can one access secrets in PySpark/Python jobs, especially when they are run via MLflow?

I have already tried the approach from How to load databricks package dbutils in pyspark, which does not work for remote jobs or MLflow project runs.

Upvotes: 2

Views: 5462

Answers (1)

simon_dmorias

Reputation: 2473

In raw PySpark you cannot do this. However, if you are developing a PySpark application specifically for Databricks, I strongly recommend you look at databricks-connect.

This allows access to parts of dbutils, including secrets, from an IDE. It also simplifies how you access storage, so that it aligns with how the code will run in production.

https://learn.microsoft.com/en-us/azure/databricks/dev-tools/databricks-connect
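As a minimal sketch of the pattern above: with databricks-connect installed and configured, `pyspark.dbutils` becomes importable outside a notebook, and you can build a dbutils handle from the Spark session. The scope and key names below are placeholders, and the imports are deferred into the helper so the code only fails if called without databricks-connect available.

```python
def connect_dbutils(spark):
    """Build a dbutils handle outside a notebook.

    Requires databricks-connect (or a Databricks runtime), which
    provides the pyspark.dbutils module; the import is deferred so
    this helper only fails when actually called without it.
    """
    from pyspark.dbutils import DBUtils
    return DBUtils(spark)

def read_secret(dbutils, scope, key):
    """Fetch one secret value; scope/key here are hypothetical names."""
    return dbutils.secrets.get(scope=scope, key=key)

# Example wiring (against a configured databricks-connect setup):
# from pyspark.sql import SparkSession
# spark = SparkSession.builder.getOrCreate()
# token = read_secret(connect_dbutils(spark), "my-scope", "db-password")
```

Passing `dbutils` into `read_secret` rather than constructing it globally keeps the secret-reading code easy to unit-test with a stub, and the same function body runs unchanged inside a notebook where `dbutils` already exists.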

Upvotes: 2
