HHH

Reputation: 6465

How to use Yarn to allocate more resources to a job

I have a few different jobs to run on a Hadoop cluster. Some need few resources and some need more, e.g. memory. I'd like to run these jobs simultaneously on my cluster, which supports YARN. I think that if I just submit the jobs to the cluster, YARN automatically decides on the resource requirements, but I'd like to specify them myself. How can I use the API or the command line to specify the resource requirements of each job?

Upvotes: 0

Views: 803

Answers (1)

Karthik

Reputation: 1811

You can set the memory for the mapper and the reducer using JobConf, either from the command line or in your Driver class.

Look specifically for the methods setMemoryForMapTask(long mem) and setMemoryForReduceTask(long mem).

https://hadoop.apache.org/docs/current/api/org/apache/hadoop/mapred/JobConf.html#setMemoryForMapTask(long) has more information and usage details.
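For example, a minimal Driver sketch using the old mapred API might look like this (the class name MyJobDriver and the 2048/4096 figures are placeholders; the values are in megabytes):

```java
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapred.FileInputFormat;
import org.apache.hadoop.mapred.FileOutputFormat;
import org.apache.hadoop.mapred.JobClient;
import org.apache.hadoop.mapred.JobConf;

public class MyJobDriver {
    public static void main(String[] args) throws Exception {
        JobConf conf = new JobConf(MyJobDriver.class);
        conf.setJobName("memory-tuned-job");

        // Request 2 GB per map task and 4 GB per reduce task (values in MB)
        conf.setMemoryForMapTask(2048);
        conf.setMemoryForReduceTask(4096);

        // ... set mapper, reducer, input/output formats as usual ...
        FileInputFormat.setInputPaths(conf, new Path(args[0]));
        FileOutputFormat.setOutputPath(conf, new Path(args[1]));

        JobClient.runJob(conf);
    }
}
```

From the command line, if your Driver is run through ToolRunner/GenericOptionsParser, you can pass the equivalent properties as generic options instead, e.g. `-D mapreduce.map.memory.mb=2048 -D mapreduce.reduce.memory.mb=4096`, so each job submission can request its own container sizes without code changes.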

Upvotes: 1
