Waqar Ahmed

Reputation: 5068

Multiple executors for a Spark application

Can one worker have multiple executors for the same Spark application in standalone and YARN mode? If not, what is the reason for that (for both standalone and YARN mode)?

Upvotes: 1

Views: 891

Answers (1)

Yehor Krivokon

Reputation: 877

Yes. You can specify the resources that Spark will use; for example, you can configure them with these spark-submit options:

--num-executors 3   
--driver-memory 4g   
--executor-memory 2g   
--executor-cores 2   

If a node has enough resources, the cluster will assign more than one executor to that node.
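As an illustration, a full spark-submit invocation using these options might look like the following sketch; the master URL, main class, and JAR name are placeholders, not part of the original answer:

spark-submit \
  --master yarn \
  --num-executors 3 \
  --driver-memory 4g \
  --executor-memory 2g \
  --executor-cores 2 \
  --class com.example.MyApp \
  my-app.jar

With a request like this, the cluster manager is free to place two of the three executors on the same node if that node has enough free memory and cores.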

You can read more about Spark resource configuration here.

Upvotes: 2
