syko

Reputation: 3637

What is the difference between spark.{driver,executor}.memory in spark-defaults.conf and SPARK_WORKER_MEMORY in spark-env.sh?

I am planning to perform an experiment on Spark.

There are two configuration files: spark-defaults.conf and spark-env.sh.

In spark-defaults.conf, there are spark.driver.memory and spark.executor.memory.

In spark-env.sh, there is SPARK_WORKER_MEMORY.

Which one should I control to adjust the memory capacity? (I use Spark 1.6.0 in standalone mode.)

Upvotes: 1

Views: 609

Answers (1)

y durga prasad

Reputation: 1202

spark-defaults.conf is the default properties file: the spark-submit script reads it when launching applications on a cluster, loads the values specified there, and passes them on to your application. Note that for overlapping settings such as driver memory, a value in spark-defaults.conf takes precedence over the corresponding environment variable in spark-env.sh (e.g. SPARK_DRIVER_MEMORY is only used as a fallback when the property is not set).
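For illustration, a minimal spark-defaults.conf might look like the sketch below; the memory values are placeholders you would tune for your cluster:

    # conf/spark-defaults.conf (example values, not defaults)
    # Memory for the driver process of each application
    spark.driver.memory    2g
    # Memory for each executor of each application
    spark.executor.memory  4g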

Which one to set depends on what you want to control: SPARK_WORKER_MEMORY (in spark-env.sh) caps the total memory a standalone worker may hand out to executors on its node, while spark.executor.memory and spark.driver.memory (in spark-defaults.conf, or SPARK_DRIVER_MEMORY as a fallback) size the executors and the driver of each application. An executor's spark.executor.memory must fit within the worker's SPARK_WORKER_MEMORY.
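As a sketch of how the two interact in standalone mode (the master URL and application jar below are hypothetical), you might cap each worker in spark-env.sh and then size each application at submit time:

    # conf/spark-env.sh (sourced by the standalone daemons)
    # Total memory this worker may give to executors on this node
    export SPARK_WORKER_MEMORY=8g

    # Per-application sizing at submit time; these flags are
    # equivalent to the spark-defaults.conf properties above
    spark-submit --master spark://master:7077 \
      --driver-memory 2g --executor-memory 4g \
      myapp.jar

With SPARK_WORKER_MEMORY=8g and spark.executor.memory=4g, each worker node could host at most two executors for that application.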

Upvotes: 3
