vikram reddy

Reputation: 1

How to configure a YARN cluster with Spark?

I have 2 machines, each with 32 GB RAM and 8 cores. How can I configure YARN with Spark, and which properties should I tune to match our dataset? The dataset is 8 GB. Can anyone suggest a YARN-with-Spark configuration for running jobs in parallel?

Here is the YARN configuration. I'm using Hadoop 2.7.3, Spark 2.2.0, and Ubuntu 16:

```
yarn.scheduler.minimum-allocation-mb     2048
yarn.scheduler.maximum-allocation-mb     5120
yarn.nodemanager.resource.memory-mb      30720
yarn.scheduler.minimum-allocation-vcores 1
yarn.scheduler.maximum-allocation-vcores 6
yarn.nodemanager.resource.cpu-vcores     6
```
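For reference, these properties would normally be set in `yarn-site.xml` on each node. A sketch of the same values in their file form (property names assumed from the list above):

```xml
<!-- yarn-site.xml: resource limits for the scheduler and NodeManager -->
<configuration>
  <property>
    <name>yarn.scheduler.minimum-allocation-mb</name>
    <value>2048</value>
  </property>
  <property>
    <name>yarn.scheduler.maximum-allocation-mb</name>
    <value>5120</value>
  </property>
  <property>
    <name>yarn.nodemanager.resource.memory-mb</name>
    <value>30720</value>
  </property>
  <property>
    <name>yarn.scheduler.minimum-allocation-vcores</name>
    <value>1</value>
  </property>
  <property>
    <name>yarn.scheduler.maximum-allocation-vcores</name>
    <value>6</value>
  </property>
  <property>
    <name>yarn.nodemanager.resource.cpu-vcores</name>
    <value>6</value>
  </property>
</configuration>
```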

Here is the Spark configuration:

```
spark.master                         master:7077
spark.yarn.am.memory                 4g
spark.yarn.am.cores                  4
spark.yarn.am.memoryOverhead         412m
spark.executor.instances             3
spark.executor.cores                 4
spark.executor.memory                4g
spark.yarn.executor.memoryOverhead   412m
```
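The same settings can also be passed per job on the command line instead of via `spark-defaults.conf`. A sketch of the equivalent `spark-submit` invocation (the application JAR name is a placeholder; note that for YARN the master would be `--master yarn` rather than a standalone `master:7077` URL):

```shell
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --conf spark.yarn.am.memory=4g \
  --conf spark.yarn.am.cores=4 \
  --conf spark.yarn.am.memoryOverhead=412m \
  --num-executors 3 \
  --executor-cores 4 \
  --executor-memory 4g \
  --conf spark.yarn.executor.memoryOverhead=412m \
  my-app.jar
```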

But my question is: with 32 GB RAM and 8 cores per machine, how many applications can I run, and is this configuration correct? Currently only two applications run in parallel.
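The container arithmetic can be sketched out, assuming YARN's default behavior of rounding each memory request up to a multiple of `yarn.scheduler.minimum-allocation-mb` and capping it at `yarn.scheduler.maximum-allocation-mb` (the `normalize` helper below is hypothetical, written to mirror that rule):

```python
import math

def normalize(request_mb: int, min_alloc_mb: int, max_alloc_mb: int) -> int:
    """Round a container request up to a multiple of the minimum
    allocation, then cap it at the maximum allocation."""
    rounded = math.ceil(request_mb / min_alloc_mb) * min_alloc_mb
    return min(rounded, max_alloc_mb)

# Values from the configuration in the question.
min_alloc, max_alloc = 2048, 5120
node_memory = 30720                      # yarn.nodemanager.resource.memory-mb

# Each executor asks for 4g + 412m overhead = 4508 MB.
executor_request = 4 * 1024 + 412
container_mb = normalize(executor_request, min_alloc, max_alloc)  # 5120 MB

containers_per_node = node_memory // container_mb   # 6 containers per node
total_containers = 2 * containers_per_node          # 12 across both nodes

# Each application needs 3 executors plus 1 AM container of similar size.
containers_per_app = 3 + 1
apps_by_memory = total_containers // containers_per_app

print(container_mb, containers_per_node, apps_by_memory)
```

By memory alone this suggests roughly 3 concurrent applications; in practice the scheduler's vcore settings (6 vcores per node against 4 cores per executor plus 4 for the AM) can be the tighter constraint, which would explain seeing fewer applications run in parallel.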

Upvotes: 0

Views: 250

Answers (0)
