progmanos

Reputation: 59

YARN AM Container Error When Running Hive Hook with Thrift

I am building a tool that requires running MapReduce jobs from a Hive SemanticAnalyzer. When my custom Java program launches the Hive Driver directly, the MapReduce jobs run fine. However, when I try to execute the hook through a modified HiveServer2 and the Hive JDBC driver (which connects to my modified server without problems), the MapReduce jobs fail with the following error:

Application application_1395851979242_0009 failed 2 times due to AM Container for appattempt_1395851979242_0009_000002 exited with exitCode: 1 due to: Exception from container-launch:
org.apache.hadoop.util.Shell$ExitCodeException:
        at org.apache.hadoop.util.Shell.runCommand(Shell.java:464)
        at org.apache.hadoop.util.Shell.run(Shell.java:379)
        at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:589)
        at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:195)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:283)
        at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:79)
        at java.util.concurrent.FutureTask$Sync.innerRun(FutureTask.java:303)
        at java.util.concurrent.FutureTask.run(FutureTask.java:138)
        at java.util.concurrent.ThreadPoolExecutor$Worker.runTask(ThreadPoolExecutor.java:886)
        at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:908)
        at java.lang.Thread.run(Thread.java:662)

Has anyone else encountered a similar error with YARN?

Upvotes: 0

Views: 1055

Answers (1)

progmanos

Reputation: 59

I used the command:
yarn logs -applicationId myAppId
to retrieve the application container logs.

After reading the logs, I found that some jars were missing from the container's classpath. Because of the nature of my application (MapReduce jobs launched from a Hive hook inside HiveServer2, reached via Thrift), I had to edit the MapReduce job configuration to include the missing jars in the "tmpjars" property, which ships them with the job so they end up in the distributed cache.
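A minimal sketch of what that looks like in the hook. The jar paths and class name below are hypothetical placeholders; "tmpjars" expects a comma-separated list of jar URIs, so the helper just joins them, and the real `conf.set` call is shown in a comment:

```java
import java.util.List;

// Hypothetical sketch: adding missing jars to a MapReduce job's
// configuration from a Hive hook. Jar paths here are made-up examples.
public class TmpJarsSketch {

    // "tmpjars" takes a single comma-separated string of jar URIs.
    static String buildTmpJars(List<String> jarPaths) {
        return String.join(",", jarPaths);
    }

    public static void main(String[] args) {
        String tmpjars = buildTmpJars(List.of(
                "file:///opt/myapp/lib/my-hook.jar",
                "file:///opt/myapp/lib/libthrift.jar"));

        // In the actual hook, apply it to the job's Configuration, e.g.:
        //   conf.set("tmpjars", tmpjars);
        System.out.println(tmpjars);
    }
}
```

With this set, the jars are distributed to the YARN containers and the AM no longer dies with a ClassNotFoundException-style exit code 1.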

Upvotes: 2
