MetallicPriest

Reputation: 30805

Can I change Spark's executor memory at runtime?

Is it possible to change the value of executor memory at runtime in Spark? The reason I want to do this is that for some map tasks I want the YARN scheduler to put each task on a separate node. By increasing the executor memory to near the total memory of a node, I can ensure each executor (and therefore each task) lands on its own node. Later on, I want to run several tasks per node, so I would lower the executor memory for them. Below is roughly what I have in mind.
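A sketch of the idea (the app name and memory values are just illustrative, assuming a node with roughly 16 GB available to YARN):

    import org.apache.spark.sql.SparkSession

    // Executor memory is set once, before the executor JVMs are launched.
    // "14g" is illustrative: close to a node's capacity, so YARN can fit
    // only one executor per node.
    val spark = SparkSession.builder()
      .appName("one-task-per-node")           // hypothetical app name
      .config("spark.executor.memory", "14g") // want to lower this later
      .getOrCreate()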

Upvotes: 6

Views: 1314

Answers (1)

red1ynx

Reputation: 3775

No, you can't.

Each executor starts in its own JVM, and you can't change a JVM's maximum heap size at runtime. Please see for reference: Setting JVM heap size at runtime
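In other words, spark.executor.memory is read once when the executor JVMs are launched. The closest you can get is stopping the SparkContext and building a new one with a different setting; a rough sketch (memory values are illustrative, and behavior may vary by deploy mode):

    import org.apache.spark.sql.SparkSession

    // Phase 1: large executors so YARN places one per node.
    var spark = SparkSession.builder()
      .config("spark.executor.memory", "14g") // illustrative value
      .getOrCreate()
    // ... run the map tasks that need a node to themselves ...

    // The setting cannot be changed in place; tear the application down
    // so new executor JVMs can be launched with a different heap size.
    spark.stop()

    // Phase 2: smaller executors so several fit per node.
    spark = SparkSession.builder()
      .config("spark.executor.memory", "2g")  // illustrative value
      .getOrCreate()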

Upvotes: 5
