ablimit

Reputation: 2361

How to set maximum number of reducers per node in Hadoop streaming?

I have a C++ based MapReduce job and I'm using Hadoop streaming.

However, the maximum number of reducers per node ends up being 7, even though I set it to 18 in the command-line configuration with mapred.tasktracker.reduce.tasks.maximum=18.

Is anything else stopping Hadoop from running more reduce tasks per node?
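For reference, a command along these lines is presumably how the property was being passed; the streaming jar path, input/output paths, and mapper/reducer binaries below are placeholders, not taken from the original question:

    hadoop jar $HADOOP_HOME/contrib/streaming/hadoop-streaming.jar \
        -D mapred.tasktracker.reduce.tasks.maximum=18 \
        -input /user/me/input \
        -output /user/me/output \
        -mapper ./cpp_mapper \
        -reducer ./cpp_reducer

Note that generic -D options must appear before the streaming-specific options for them to be picked up at all.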

Upvotes: 1

Views: 2111

Answers (1)

Chris White

Reputation: 30089

After amending the mapred.tasktracker.reduce.tasks.maximum property, are you restarting the task trackers in your cluster? You should be able to go to the Job Tracker web UI and confirm that each task tracker now has 18 reducer slots configured.
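The property is a per-daemon setting that each task tracker reads when it starts, so passing it on a single job's command line does not change the slot count; it needs to be set in mapred-site.xml on every node. A minimal sketch of the relevant entry (the value and description text are illustrative):

    <!-- mapred-site.xml on each task tracker node -->
    <property>
      <name>mapred.tasktracker.reduce.tasks.maximum</name>
      <value>18</value>
      <description>Maximum number of reduce task slots this task tracker runs simultaneously.</description>
    </property>

After editing the file on all nodes and restarting the task trackers, the new slot count should show up on the Job Tracker web UI.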

Upvotes: 1
