user9332151

Reputation: 57

Required executor memory is above the max threshold of this cluster

I am running Spark on an 8-node cluster with YARN as the resource manager. I have 64GB of memory per node, and I set the executor memory to 25GB, but I get the error:

    Required executor memory (25600MB) is above the max threshold (16500 MB) of this cluster!
    Please check the values of 'yarn.scheduler.maximum-allocation-mb' and/or 'yarn.nodemanager.resource.memory-mb'.

I then set yarn.scheduler.maximum-allocation-mb and yarn.nodemanager.resource.memory-mb to 25600, but nothing changed.
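For reference, these are the entries I changed in yarn-site.xml (values in MB). A minimal sketch, assuming a standard Hadoop setup where the file has to be updated on every node and the ResourceManager and NodeManagers restarted before the new values take effect:

    <!-- yarn-site.xml: update on every node, then restart the YARN daemons -->
    <property>
      <name>yarn.nodemanager.resource.memory-mb</name>
      <value>25600</value> <!-- total memory YARN may hand out on this node -->
    </property>
    <property>
      <name>yarn.scheduler.maximum-allocation-mb</name>
      <value>25600</value> <!-- largest single container the scheduler will grant -->
    </property>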

Upvotes: 3

Views: 11261

Answers (1)

tk421

Reputation: 5947

Executor memory is only the heap portion of the memory. You still have to run a JVM and allocate the non-heap portion of memory inside a container, and all of that has to fit within YARN's limits. See the memory-layout diagram in How-to: Tune Your Apache Spark Jobs (Part 2) by Sandy Ryza.
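To make the arithmetic concrete: with the default overhead of max(384 MB, 10% of executor memory) (spark.yarn.executor.memoryOverhead in older Spark releases, spark.executor.memoryOverhead since Spark 2.3), a 25GB executor actually asks YARN for roughly:

    executor heap             25600 MB
    overhead (10%, min 384)    2560 MB
    container request         28160 MB  >  16500 MB max-allocation  =>  rejected

So the YARN maximum must cover the heap plus the overhead, not just the heap.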

If you want to use an executor memory of 25GB, I suggest you bump yarn.scheduler.maximum-allocation-mb and yarn.nodemanager.resource.memory-mb to something higher, such as 42GB, so the heap plus overhead fits with room to spare.
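Once the YARN limits are raised and the daemons restarted, a submission along these lines should go through. A minimal sketch; the application jar and the executor count are placeholders:

    # hypothetical example: your-app.jar and --num-executors are placeholders
    spark-submit \
      --master yarn \
      --deploy-mode cluster \
      --executor-memory 25g \
      --num-executors 8 \
      your-app.jar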

Upvotes: 2
