Reputation: 3435
I am running spark-shell locally, passing 2G as the driver memory:
alex@POSITRON /ssd2/spark-2.2.0-bin-hadoop2.7/bin $ bash spark-shell --master local --driver-memory 2G
After it is up and running, I go to the "Environment" tab in the Spark UI and see that my setting is in effect.
Then I go to the "Executors" tab, where it shows that only 956 MB seems to be the effective value.
Could you clarify where this 956MB value comes from because I feel I do not understand either config options or UI figures correctly?
Upvotes: 2
Views: 2221
Reputation: 27373
What you see in the Spark UI is the memory available for storage and execution, which is a fraction of the JVM heap rather than the raw --driver-memory value: roughly (heap size - 300 MB reserved) * spark.memory.fraction (default 0.6). The JVM also reports a usable heap somewhat below -Xmx2G, which is how you end up at about 956 MB. See https://spark.apache.org/docs/latest/configuration.html#memory-management
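For reference, you can reproduce the figure from within the same spark-shell. This is only a sketch of the sizing rule above (the 300 MB reserve and the 0.6 default are taken from the docs page linked above, not from Spark's actual internal code path):

    val reserved = 300L * 1024 * 1024                 // fixed reserve taken off the heap
    val fraction = 0.6                                // spark.memory.fraction default
    val maxHeap  = Runtime.getRuntime.maxMemory       // usable heap, a bit below -Xmx2G
    val unified  = ((maxHeap - reserved) * fraction).toLong
    println(f"${unified / (1024.0 * 1024.0)}%.1f MB") // prints roughly 956 MB with --driver-memory 2G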
Upvotes: 5