Reputation: 89
I want to change the initial/minimum heap size of my executors while running Spark on YARN. Right now it throws the following exception:
java.lang.Exception: spark.executor.extraJavaOptions is not allowed to alter memory settings
I am passing --conf "spark.executor.extraJavaOptions=-Xms4096m" while running my spark-shell.
I am using Spark 1.6.0. Any help is greatly appreciated!
Upvotes: 1
Views: 5459
Reputation: 11593
A bit about spark.executor.extraJavaOptions, from the docs:
Note that it is illegal to set Spark properties or heap size settings with this option. Spark properties should be set using a SparkConf object or the spark-defaults.conf file used with the spark-submit script. Heap size settings can be set with spark.executor.memory.
So the heap size cannot be set through extraJavaOptions. Try this instead: --conf "spark.executor.memory=4g"
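To make this concrete, here is a minimal sketch of the usual ways to set the executor heap size, assuming a YARN cluster whose container limits allow 4 GB executors (the master and value are illustrative, adjust to your setup):

```shell
# Option 1: pass it on the command line when launching spark-shell
# (or spark-submit); this replaces the rejected -Xms flag.
spark-shell --master yarn --conf "spark.executor.memory=4g"

# Option 2: set it once in conf/spark-defaults.conf so it applies
# to every submission (whitespace-separated key and value):
#   spark.executor.memory   4g
```

Either form avoids the exception, since Spark manages the executor JVM's heap flags itself from spark.executor.memory.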
Upvotes: 3