Reputation: 3599
In my Spark cluster the following configuration is set:
spark.executor.memory=2g
I would like to know whether this 2G of RAM is shared by all executors, or whether each executor on each worker machine gets its own 2G.
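For context, a minimal sketch of how this property can also be set programmatically (instead of in spark-defaults.conf); the application name here is purely illustrative, and the master URL is assumed to be supplied by spark-submit:

    // Illustrative only: setting spark.executor.memory from code
    import org.apache.spark.{SparkConf, SparkContext}

    val conf = new SparkConf()
      .setAppName("executor-memory-example")   // hypothetical app name
      .set("spark.executor.memory", "2g")      // same setting as above
    val sc = new SparkContext(conf)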
Upvotes: 2
Views: 265
Reputation: 149538
I would like to know whether this 2G of RAM is shared by all executors, or whether each executor on each worker machine gets its own 2G.
This setting gives each executor on every one of your Worker nodes 2G of memory. It does not mean "share 2G of memory between all executors"; it means "give each executor 2G of memory".
This is explicitly stated in the documentation (emphasis mine):
spark.executor.memory | 1g | Amount of memory to use per executor process (e.g. 2g, 8g).
If you have multiple executors per Worker node, each of those executors will consume 2G of memory, so the total consumed on that node is the number of executors times 2G (see the sketch below).
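As a rough sketch of the per-worker arithmetic, assuming (hypothetically) that 3 executors are scheduled on one Worker node:

    // Assumed values for illustration only
    val executorMemoryGb = 2        // spark.executor.memory = 2g
    val executorsPerWorker = 3      // hypothetical executor count on this worker
    val totalHeapGb = executorMemoryGb * executorsPerWorker
    println(s"Executor heap requested on this worker: $totalHeapGb GB")   // prints 6 GB

Note that each executor JVM also needs some off-heap overhead on top of this heap, so the actual memory footprint per Worker node is somewhat higher than this product.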
Upvotes: 2