Kas1

Reputation: 165

Spark set driver memory config in Databricks

I'm working on Azure Databricks. My driver node and worker node specs are: 14.0 GB memory, 4 cores, 0.75 DBU (Standard_DS3_v2).

My PySpark notebook fails with a Java heap space error. I checked online and one suggestion was to increase the driver memory. I'm trying to use the following conf call in the notebook

spark.conf.get("spark.driver.memory")

to read the driver memory, but the notebook cell fails with this error:

java.util.NoSuchElementException: spark.driver.memory

Any idea how to check driver memory and change its value?

Upvotes: 6

Views: 17425

Answers (2)

Matan Sheffer

Reputation: 123

This should increase the memory available to the cluster when you hit the limit. It's the same as the other answer, but as text rather than a screenshot:

spark.executor.memory 19g
spark.driver.memory 19g
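
As a rough sanity check (a minimal sketch, assuming a standard Databricks notebook where spark is the active SparkSession), you can read the values back from the driver's SparkConf after the cluster restarts with the new config. spark.conf.get() in the question fails because that key is not exposed through the runtime config, but the SparkContext's configuration usually does carry it:

# Read the memory settings back from the SparkContext's SparkConf
# (SparkConf.get raises an error if the key was never set).
print(spark.sparkContext.getConf().get("spark.driver.memory"))
print(spark.sparkContext.getConf().get("spark.executor.memory"))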

Upvotes: 0

RudyVerboven

Reputation: 1274

You can set the Spark config when you set up your cluster on Databricks. When you create a cluster and expand the "Advanced Options" menu, you will see a "Spark Config" section. In this field you can set the configuration values you want.

[Screenshot: the "Spark Config" field under the cluster's Advanced Options]

For more information, you can always check the Azure Databricks documentation.
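
For example (these values are illustrative, assuming you want to raise both driver and executor memory), the Spark Config field takes one space-separated key-value pair per line:

spark.driver.memory 10g
spark.executor.memory 10g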

Upvotes: 13
