Charles Upjohn

Reputation: 3

Spark - config file that sets spark.storage.memoryFraction

I have come to learn that spark.storage.memoryFraction and spark.storage.safetyFraction are multiplied by the executor memory supplied to the SparkContext. I have also learned that lowering memoryFraction can improve performance.
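For instance, assuming the Spark 1.x defaults of 0.6 for spark.storage.memoryFraction and 0.9 for spark.storage.safetyFraction, an executor started with 4 GB would reserve roughly 4 GB × 0.6 × 0.9 ≈ 2.2 GB for cached blocks.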

The question is where do I set the spark.storage.memoryFraction? Is there a config file?

Upvotes: 0

Views: 658

Answers (2)

banjara

Reputation: 3890

I recommend setting it on a per-job basis instead of updating spark-defaults.conf: create a config file per job, say spark.properties, and pass it to spark-submit with

--properties-file /spark.properties
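As a minimal sketch, assuming a job packaged as app.jar with main class com.example.App (both hypothetical names), the per-job file and submit command might look like:

# spark.properties (one file per job)
spark.storage.memoryFraction  0.4

spark-submit --properties-file /spark.properties --class com.example.App app.jar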

Upvotes: 0

yjshen

Reputation: 6693

The default file that Spark searches for such configurations is conf/spark-defaults.conf.

If you want to move the conf directory to a custom location, set SPARK_CONF_DIR in conf/spark-env.sh.
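As a sketch of both files, assuming /etc/spark/conf is a directory you have created (hypothetical path):

# conf/spark-defaults.conf
spark.storage.memoryFraction  0.4

# conf/spark-env.sh: point Spark at a custom conf directory
export SPARK_CONF_DIR=/etc/spark/conf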

Upvotes: 1
