nam

Reputation: 23868

Azure Databricks cluster Spark configuration is disabled

When creating an Azure Databricks workspace and configuring its cluster, I chose the default languages for Spark to be Python and SQL. Now I want to add Scala as well. When running a Scala script I got the error quoted below. My online search took me to an article describing that the cluster configuration can be changed by going to the Advanced Options section of the cluster settings page and clicking the Spark tab there (as shown in the first image below), but I find the Spark section there greyed out (disabled).

Question: How can I enable the Spark section under Advanced Options on the cluster settings page (shown in the first image below) so I can edit the last line of that section? Note: I created the Databricks workspace and its cluster, so I am an admin (as shown in the second image below).

Databricks Notebook error: Your administrator has only allowed sql and python commands on this cluster.
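For reference, any Scala cell triggers the error; a minimal, purely illustrative cell:

```scala
%scala
// Any Scala code raises the error on this cluster; the cell content itself is irrelevant.
val df = spark.range(10)
display(df)
```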

[Image 1: Spark config section under Advanced Options in the cluster settings page, greyed out]

[Image 2: workspace settings showing the author has admin rights]

Upvotes: 2

Views: 7000

Answers (1)

Alex Ott

Reputation: 87299

You need to click the "Edit" button in the cluster controls; after that you should be able to change the Spark configuration. But you can't enable Scala on High Concurrency clusters with credential passthrough, as they support only Python & SQL (doc). The primary reason is that Scala would let you bypass user isolation.
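Once the config is editable, the relevant setting is the language allow-list; a minimal sketch, assuming the cluster restricts languages via the spark.databricks.repl.allowedLanguages Spark config (the setting behind the error message above):

```
spark.databricks.repl.allowedLanguages python,sql,scala
```

On a High Concurrency cluster with credential passthrough, however, this list cannot include scala, per the limitation above.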

If you need credential passthrough plus Scala, then you need to use a Standard cluster, but it will work only for a single designated user (doc).
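A minimal sketch of the Standard-cluster route, assuming the documented spark.databricks.passthrough.enabled flag is what turns on single-user credential passthrough there:

```
spark.databricks.passthrough.enabled true
```

On a Standard cluster all notebook languages, including Scala, are available by default, so no allowedLanguages change should be needed.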

Upvotes: 3
