Saurabh Sharma

Reputation: 325

Spark standalone cluster behavior Query

We have two Linux machines with 20 cores each. We need to set up a standalone cluster with the Spark scheduler; we cannot use Cloudera/Hortonworks for the time being. My query is about the Spark scheduler.

If I configure one machine as both master and worker and the second machine as a worker, how many cores will be available to run my Spark code? Will it be 40 or fewer? Can I run multiple jobs on this cluster in cluster mode?

Upvotes: 0

Views: 51

Answers (1)

Sim

Reputation: 13528

You need one core for your driver, so the maximum number of cores you can have for executors is 39.
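As a rough illustration of that arithmetic, here is a minimal Scala sketch that caps executor cores at 39 so one of the 40 cores stays free for a cluster-mode driver. The master host name and the per-executor core count are assumptions for the example, not something from the question:

```scala
// Minimal sketch: cap executor cores so one core is left for the driver.
// Host name and per-executor core count below are illustrative assumptions.
import org.apache.spark.sql.SparkSession

object CoreCapExample {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder()
      .appName("core-cap-example")
      .master("spark://master-host:7077")   // standalone master URL (hypothetical host)
      .config("spark.cores.max", "39")      // leave 1 of the 40 cores for the driver
      .config("spark.executor.cores", "5")  // cores per executor (illustrative)
      .getOrCreate()

    // Trivial job just to exercise the executors.
    println(spark.range(1000000L).count())
    spark.stop()
  }
}
```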

How your cluster runs jobs depends on how you configure resource scheduling. In standalone mode, only FIFO job scheduling within a single application is supported, but you can allocate a different amount of resources to each application.
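If you want several applications running at the same time, you can cap each one's share of the cluster. Below is a hedged variation of the same sketch where the cap is passed as an argument; submitting it twice with caps of, say, 20 and 19 would let both applications hold executors concurrently, with the remaining core covering a cluster-mode driver. The names and numbers are illustrative only:

```scala
// Sketch: per-application executor-core cap, so multiple apps can share the cluster.
// The default cap and the app name are illustrative assumptions.
import org.apache.spark.sql.SparkSession

object SharedClusterApp {
  def main(args: Array[String]): Unit = {
    val coreCap = args.headOption.getOrElse("20")  // executor-core cap for this application

    val spark = SparkSession.builder()
      .appName(s"shared-cluster-app-$coreCap")
      .config("spark.cores.max", coreCap)          // cores this application may claim
      .getOrCreate()

    println(spark.range(1000000L).count())
    spark.stop()
  }
}
```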

Upvotes: 1
