Shweta

Reputation: 121

Error in Hadoop MapReduce

When I run a mapreduce program using Hadoop, I get the following error.

10/01/18 10:52:48 INFO mapred.JobClient: Task Id : attempt_201001181020_0002_m_000014_0, Status : FAILED
  java.io.IOException: Task process exit with nonzero status of 1.
    at org.apache.hadoop.mapred.TaskRunner.run(TaskRunner.java:418)
10/01/18 10:52:48 WARN mapred.JobClient: Error reading task outputhttp://ubuntu.ubuntu-domain:50060/tasklog?plaintext=true&taskid=attempt_201001181020_0002_m_000014_0&filter=stdout
10/01/18 10:52:48 WARN mapred.JobClient: Error reading task outputhttp://ubuntu.ubuntu-domain:50060/tasklog?plaintext=true&taskid=attempt_201001181020_0002_m_000014_0&filter=stderr

What is this error about?

Upvotes: 10

Views: 12312

Answers (5)

user1989252

Reputation: 79

Increase your ulimit to unlimited. Alternatively, reduce the memory allocated to each task.

Upvotes: 2

Binary Nerd

Reputation: 13902

One reason Hadoop produces this error is that the directory containing the task log files has become too full. This is a limit of the ext3 filesystem, which allows a maximum of 32,000 links per inode and therefore caps the number of subdirectories a single directory can hold.

Check how full your logs directory is in hadoop/userlogs.

A simple test for this problem is to try creating a directory from the command line, for example:

$ mkdir hadoop/userlogs/testdir

If you have too many directories in userlogs, the OS should fail to create the new directory and report that there are too many links.
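The check above can be sketched as a small script. This is a sketch, not part of the original answer: the hadoop/userlogs path is the one mentioned above and is assumed to be relative to your Hadoop install, so adjust it as needed.

```shell
# Hypothetical path from the answer above; adjust for your install.
LOGDIR=hadoop/userlogs
mkdir -p "$LOGDIR"

# ext3 allows at most ~32000 links per directory inode, so once userlogs
# accumulates that many per-task directories, new mkdir calls start failing.
# Count how many subdirectories are currently there:
find "$LOGDIR" -mindepth 1 -maxdepth 1 -type d | wc -l

# The mkdir probe from the answer: on a full ext3 directory this fails
# with a "too many links" error instead of succeeding.
mkdir "$LOGDIR/testdir" && rmdir "$LOGDIR/testdir"
```

If the count is anywhere near 32,000, clear out old task directories under userlogs before rerunning the job.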

Upvotes: 14

If you create a runnable JAR file in Eclipse, it gives that error on the Hadoop system. Export the project as a plain JAR instead of a runnable one. That solved my problem.

Upvotes: 0

Thamizh

Reputation: 83

Another cause can be a JVM error: you try to allocate dedicated heap space to the JVM, and that much memory is not available on your machine.

sample code:
conf.set("mapred.child.java.opts", "-Xmx4096m");

Error message:
Error occurred during initialization of VM
Could not reserve enough space for object heap

Solution: Replace the -Xmx value with an amount of memory your machine can actually provide to the JVM, for example:

conf.set("mapred.child.java.opts", "-Xmx1024m");

Upvotes: 2

wlk

Reputation: 5785

I was having the same issue when I ran out of disk space on the partition holding the log directory.
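A quick way to check for this condition is sketched below; the path is illustrative (`.` as a stand-in), so point df at wherever your Hadoop logs actually live.

```shell
# Show free space on the filesystem holding the log directory.
# "." is a placeholder; replace it with your Hadoop log path.
df -h .

# Extract the Use% column from the second line of df output and
# report whether any space remains (100% means the disk is full).
usage=$(df . | awk 'NR==2 {gsub("%",""); print $5}')
[ "$usage" -lt 100 ] && echo "space available" || echo "disk full"
```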

Upvotes: 2
