timbram

Reputation: 1865

Apache Spark: Limit number of executors used by Spark App

Is it possible to limit the number of executors a Spark application uses? I can set the initial number of executors with spark.executor.instances, but later on in my app Spark seems to add every available executor on its own.

This prevents any other jobs from running on the cluster at the same time. Googling has led me to no solution except spark.cores.max, which doesn't seem to limit the total number of executors...
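For reference, a minimal sketch of what I'm doing at submit time (my-app.jar is a placeholder name):

    spark-submit \
      --conf spark.executor.instances=4 \
      --conf spark.cores.max=8 \
      my-app.jar

Both properties are set, yet the executor count still grows past the initial 4.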

Upvotes: 1

Views: 4012

Answers (1)

loneStar

Reputation: 4010

spark.dynamicAllocation.maxExecutors=infinity

This property defaults to infinity; set it to a finite value to cap the number of executors that dynamic allocation can request for your application.
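A minimal sketch of capping executors at submit time; the cap of 4 and the jar name are arbitrary examples:

    spark-submit \
      --conf spark.dynamicAllocation.enabled=true \
      --conf spark.dynamicAllocation.maxExecutors=4 \
      my-app.jar

With dynamic allocation enabled, Spark will scale executors up and down with the workload but never beyond the maxExecutors ceiling, so other jobs can still get resources on the cluster.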

Upvotes: 3
