JAT

Reputation: 79

Create Scope in Databricks API 2.0 - INVALID_PARAMETER_VALUE

I am getting this error when creating a scope.

How can I create an AAD token and integrate it into the script?


Upvotes: 1

Views: 1628

Answers (2)

Alex Ott

Reputation: 87279

It's possible to create a secret scope backed by Azure Key Vault, but you need an Azure Active Directory (AAD) token, not a Databricks personal access token. You can do this via the Databricks CLI or the Databricks Terraform provider; both use the REST API under the hood, but as mentioned at the beginning, you need an AAD token, which you can obtain via the Azure CLI.
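A minimal sketch of that flow in Python, assuming the Azure CLI is installed and logged in; the workspace URL, Key Vault resource ID, and DNS name below are placeholders you must replace. The resource ID `2ff814a6-3304-4ab8-85cb-cd0e6f879c1d` is the well-known application ID of the Azure Databricks service, which the AAD token must be issued for:

```python
import json
import subprocess
import urllib.request

# Placeholders -- replace with your own values.
WORKSPACE_URL = "https://adb-1234567890123456.7.azuredatabricks.net"
KEYVAULT_RESOURCE_ID = (
    "/subscriptions/<sub-id>/resourceGroups/<rg>/providers/"
    "Microsoft.KeyVault/vaults/<vault-name>"
)
KEYVAULT_DNS = "https://<vault-name>.vault.azure.net/"

# Well-known application ID of the Azure Databricks resource;
# the AAD token must target this resource, not ARM.
DATABRICKS_RESOURCE_ID = "2ff814a6-3304-4ab8-85cb-cd0e6f879c1d"


def get_aad_token() -> str:
    """Fetch an AAD access token for Databricks via the Azure CLI."""
    return subprocess.check_output(
        ["az", "account", "get-access-token",
         "--resource", DATABRICKS_RESOURCE_ID,
         "--query", "accessToken", "-o", "tsv"],
        text=True,
    ).strip()


def build_create_scope_request(scope_name: str) -> dict:
    """Request body for POST /api/2.0/secrets/scopes/create."""
    return {
        "scope": scope_name,
        "scope_backend_type": "AZURE_KEYVAULT",
        "backend_azure_keyvault": {
            "resource_id": KEYVAULT_RESOURCE_ID,
            "dns_name": KEYVAULT_DNS,
        },
    }


def create_scope(scope_name: str) -> None:
    """Create a Key Vault-backed secret scope using the AAD token."""
    body = json.dumps(build_create_scope_request(scope_name)).encode()
    req = urllib.request.Request(
        f"{WORKSPACE_URL}/api/2.0/secrets/scopes/create",
        data=body,
        headers={
            "Authorization": f"Bearer {get_aad_token()}",
            "Content-Type": "application/json",
        },
    )
    urllib.request.urlopen(req)


if __name__ == "__main__":
    # Show the payload; call create_scope("my-kv-scope") to send it.
    print(json.dumps(build_create_scope_request("my-kv-scope"), indent=2))
```

Sending the same payload with a Databricks personal access token instead of an AAD token is what produces the `INVALID_PARAMETER_VALUE` error from the question.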

My personal preference is the Databricks Terraform provider, which is very flexible and easier to use than the REST API or the Databricks CLI.
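With the Terraform provider, the scope is declared rather than scripted. A sketch, assuming the provider is configured to authenticate with AAD and that `azurerm_key_vault.this` is an existing Key Vault resource in the same configuration (both are assumptions, not part of the question):

```hcl
# Key Vault-backed secret scope via the Databricks Terraform provider.
# The scope name and the referenced Key Vault are placeholders.
resource "databricks_secret_scope" "kv" {
  name = "my-kv-scope"

  keyvault_metadata {
    resource_id = azurerm_key_vault.this.id
    dns_name    = azurerm_key_vault.this.vault_uri
  }
}
```

The provider handles acquiring the AAD token itself, which sidesteps the token-type problem entirely.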

Upvotes: 1

Leo Liu

Reputation: 76928

Create Scope in Databricks API 2.0 - INVALID_PARAMETER_VALUE

This is a known issue with the Databricks API and that PowerShell module:

Scope with Azure KeyVault must have userAADToken defined

Databricks are changing the API and will not commit to the final state until Key Vault-backed scopes come out of Preview. I've no timescales yet. In the meantime, if you need these I would deploy them manually - the CLI and REST API do not support them yet.

So, AFAIK, there is currently no way to create an Azure Key Vault-backed scope with the REST API; we can only create one in the Azure Databricks UI. In other words, if we provide the Key Vault resource ID when calling the REST API or CLI, the request cannot be processed by the backend server.

Upvotes: 1
