Reputation: 21
I have a cluster with Cloudera 5.10. For profiling, I'm running spark-submit with the parameters:
--conf "spark.driver.extraJavaOptions= -agentpath:/root/yjp-2017.02/bin/linux-x86-64/libyjpagent.so=sampling"
--conf "spark.executor.extraJavaOptions= -agentpath:/root/yjp-2017.02/bin/linux-x86-64/libyjpagent.so=sampling"
This works only for the driver. When I use this option for the executors, I get the following exception:
Stack trace: ExitCodeException exitCode=1:
at org.apache.hadoop.util.Shell.runCommand(Shell.java:601)
at org.apache.hadoop.util.Shell.run(Shell.java:504)
at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:786)
at org.apache.hadoop.yarn.server.nodemanager.DefaultContainerExecutor.launchContainer(DefaultContainerExecutor.java:213)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:302)
at org.apache.hadoop.yarn.server.nodemanager.containermanager.launcher.ContainerLaunch.call(ContainerLaunch.java:82)
at java.util.concurrent.FutureTask.run(FutureTask.java:266)
at java.util.concurrent.ThreadPoolExecutor.runWorker(ThreadPoolExecutor.java:1142)
at java.util.concurrent.ThreadPoolExecutor$Worker.run(ThreadPoolExecutor.java:617)
at java.lang.Thread.run(Thread.java:748)
I couldn't find any useful logs, and I get the same exception on every node. The same happens if I follow this manual.
If I keep only the driver configuration, everything works fine and I can use YourKit to connect to the driver. What could be the problem?
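For reference, the two flags from the question can be combined into a single spark-submit invocation. This is a sketch only: the application jar, class name, and master are hypothetical placeholders; the agent paths are taken verbatim from the question and must exist at that path on the machine where each JVM starts.

```shell
# Hypothetical invocation; MyApp/app.jar are placeholders, agent paths are from the question.
spark-submit \
  --master yarn \
  --class com.example.MyApp \
  --conf "spark.driver.extraJavaOptions=-agentpath:/root/yjp-2017.02/bin/linux-x86-64/libyjpagent.so=sampling" \
  --conf "spark.executor.extraJavaOptions=-agentpath:/root/yjp-2017.02/bin/linux-x86-64/libyjpagent.so=sampling" \
  app.jar
```

Note that the driver option takes effect on the machine running the driver, while the executor option is applied inside each YARN container, so the `.so` file must be readable at that exact path on every NodeManager host.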
Upvotes: 1
Views: 860
Reputation: 399
I experienced the same issue. You have to install YourKit on all nodes in the cluster.
Upvotes: 0
Reputation: 725
Maybe the executor launches a 32-bit JVM? In that case the path to the 32-bit YourKit agent should be specified.
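A quick way to check this hypothesis on a worker node is to ask the OS and the JVM about their bitness; a 64-bit HotSpot JVM prints "64-Bit" in its version banner. A minimal sketch (assumes `java` is on the PATH of the node being checked):

```shell
# OS userland word size: prints 64 or 32.
getconf LONG_BIT

# A 64-bit JVM advertises "64-Bit" in its version output;
# if nothing matches, the JVM may be 32-bit (or java is missing).
java -version 2>&1 | grep -o '64-Bit' || echo "possibly 32-bit JVM"
```

If the executor JVM turns out to be 32-bit, the agent path would need to point at the 32-bit library under the YourKit installation instead of `linux-x86-64`.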
Upvotes: 1