Reputation: 3
I have come to learn that spark.storage.memoryFraction and spark.storage.safetyFraction are multiplied by the executor memory supplied in the SparkContext. I have also learned that it is desirable to lower memoryFraction for better performance.
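For example, assuming the defaults of 0.6 for memoryFraction and 0.9 for safetyFraction, an executor given 4 GB would reserve roughly 4 GB × 0.6 × 0.9 ≈ 2.16 GB for cached storage; lowering memoryFraction shrinks that slice and leaves more of the heap for shuffle and user objects.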
The question is where do I set the spark.storage.memoryFraction? Is there a config file?
Upvotes: 0
Views: 658
Reputation: 3890
I recommend setting it on a per-job basis instead of updating spark-defaults.conf: create a config file per job, say spark.properties, and pass it to spark-submit with
--properties-file /spark.properties
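A minimal sketch of what that could look like, assuming Spark 1.x (where spark.storage.memoryFraction still applies); the fraction values, application class, jar name, and file path are placeholders:

# spark.properties -- per-job overrides (example values)
spark.storage.memoryFraction 0.4
spark.storage.safetyFraction 0.9

# submit the job with the per-job properties file
spark-submit --class com.example.MyApp --properties-file /path/to/spark.properties my-app.jar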
Upvotes: 0
Reputation: 6693
The default file that Spark searches for such configuration is conf/spark-defaults.conf.
If you want to move the conf directory to a customized location, set SPARK_CONF_DIR in conf/spark-env.sh.
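A minimal sketch, again assuming Spark 1.x; the fraction value and directory path are examples only:

# conf/spark-defaults.conf -- picked up by every job submitted from this installation
spark.storage.memoryFraction 0.4

# conf/spark-env.sh -- point Spark at a custom configuration directory
export SPARK_CONF_DIR=/opt/spark/custom-conf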
Upvotes: 1