Reihan_amn

Reputation: 2747

setting hadoop mapreduce size without mapred-site.xml

I am running a mapreduce job on server and am constantly getting this error:

Container is running beyond physical memory limits. Current usage: 1.0
GB of 1 GB physical memory used; 2.7 GB of 2.1 GB virtual memory used.
Killing container.
Container killed on request. Exit code is 143
Container exited with a non-zero exit code 143

Of course I have read all the resources I could find, and I know that I need to set the configuration in these files: mapred-site.xml and yarn-site.xml.

But our server doesn't let me overwrite those properties, and I want a way to do it from the terminal or from the configuration of my Hadoop program.

I was running this job with Hive and was able to override these properties like this:

set HADOOP_HEAPSIZE=4096;
set mapreduce.reduce.memory.mb=4096;
set mapreduce.map.memory.mb=4096;
set tez.am.resource.memory.mb=4096;
set yarn.app.mapreduce.am.resource.mb=4096;

But when I am writing a MapReduce program instead of a Hive query, how can I change these?

How can I export mapreduce.reduce.memory.mb in shell for example?

Upvotes: 1

Views: 1914

Answers (2)

Reihan_amn

Reputation: 2747

I ended up solving this problem by setting the memory sizes in my MapReduce program like this:

conf.set("mapreduce.map.memory.mb", "4096");
conf.set("mapreduce.reduce.memory.mb", "8192");
conf.set("mapred.child.java.opts", "-Xmx512m");
conf.set("tez.am.resource.memory.mb", "4096");
conf.set("yarn.app.mapreduce.am.resource.mb", "4096");

Be careful with the value you assign to the reducer. I originally gave my mapper and reducer the same amount of memory and got errors; they shouldn't be the same!
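For context, here is a minimal driver sketch showing where these `conf.set` calls go, assuming a standard Java MapReduce job; `MyDriver`, `MyMapper`, and `MyReducer` are placeholder class names, not from the original post:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class MyDriver {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    // Set the memory properties BEFORE creating the Job:
    // Job.getInstance takes a copy of the Configuration.
    conf.set("mapreduce.map.memory.mb", "4096");
    conf.set("mapreduce.reduce.memory.mb", "8192");
    // The JVM heap (-Xmx) must fit inside the container limits above.
    conf.set("mapreduce.map.java.opts", "-Xmx3276m");
    conf.set("mapreduce.reduce.java.opts", "-Xmx6553m");
    conf.set("yarn.app.mapreduce.am.resource.mb", "4096");

    Job job = Job.getInstance(conf, "my job");
    job.setJarByClass(MyDriver.class);
    job.setMapperClass(MyMapper.class);
    job.setReducerClass(MyReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));
    FileOutputFormat.setOutputPath(job, new Path(args[1]));
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}
```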

Upvotes: 1

Paul

Reputation: 1194

You may need to specify them like this in order to set the configuration parameters per application/job:

export HADOOP_HEAPSIZE=4096
hadoop jar YOUR_JAR.jar ClassName -Dmapreduce.reduce.memory.mb=4096 -Dmapreduce.map.memory.mb=4096 -Dyarn.app.mapreduce.am.resource.mb=4096 /input /output

Note: Replace YOUR_JAR.jar with your Jar and ClassName with your Driver class name
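Note that the `-D` options are only picked up if the driver parses generic options, typically by implementing `Tool` and launching through `ToolRunner`. A minimal sketch of that pattern (class names are placeholders, not from the original answer):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.conf.Configured;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.util.Tool;
import org.apache.hadoop.util.ToolRunner;

public class MyDriver extends Configured implements Tool {
  @Override
  public int run(String[] args) throws Exception {
    // getConf() already contains any -Dkey=value options that
    // ToolRunner parsed off the command line.
    Configuration conf = getConf();
    Job job = Job.getInstance(conf, "my job");
    job.setJarByClass(MyDriver.class);
    // ... set mapper, reducer, input and output paths here ...
    return job.waitForCompletion(true) ? 0 : 1;
  }

  public static void main(String[] args) throws Exception {
    System.exit(ToolRunner.run(new Configuration(), new MyDriver(), args));
  }
}
```

With this in place, the `hadoop jar ... -Dmapreduce.reduce.memory.mb=4096 ...` invocation above overrides the properties for that run only, without touching mapred-site.xml.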

Upvotes: 3
