mahendra singh

Reputation: 384

Class org.apache.oozie.action.hadoop.SparkMain not found

The following are all the Oozie files I have been using to run the job. I created a folder on HDFS, /test/jar, and put the workflow.xml and coordinator.xml files there.

Properties File

nameNode=hdfs://host:8020
jobTracker=host:8050
queueName=default
oozie.use.system.lib.path=true
oozie.coord.application.path=${nameNode}/test/jar/coordinator.xml
oozie.action.sharelib.for.spark=spark2
start=2019-05-22T07:37Z
end=2019-05-22T07:40Z
freq=*/1 * * * *
zone=UTC
user.name=oozie
oozie.action.sharelib.for.spark.exclusion=oozie/jackson
#oozie.libpath=${nameNode}/user/oozie/share/lib
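For reference, a coordinator configured with a properties file like this is normally submitted through the Oozie CLI; a minimal sketch, assuming the Oozie server runs on its default port and the file is named job.properties (both are placeholders, not taken from the question):

oozie job -oozie http://host:11000/oozie -config job.properties -run
oozie job -oozie http://host:11000/oozie -info <coordinator-job-id>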

Coordinator File

<coordinator-app xmlns = "uri:oozie:coordinator:0.5" name = "test" frequency = "${freq}" start = "${start}" end = "${end}" timezone = "${zone}">
   <controls>
      <timeout>1</timeout>
   </controls>
   <action>
      <workflow>
         <app-path>${nameNode}/test/jar/workflow.xml</app-path>
      </workflow>
   </action>
</coordinator-app>

Workflow file

<workflow-app name="sample-wf" xmlns="uri:oozie:workflow:0.5">
 <start to="test" />
    <action name="test">
        <spark xmlns="uri:oozie:spark-action:0.2">
            <job-tracker>${jobTracker}</job-tracker>
            <name-node>${nameNode}</name-node>
            <master>yarn</master>
            <mode>cluster</mode>
            <name>Spark Example</name>
            <class>com.spark.excel.mysql.executor.Executor</class>
            <jar>${nameNode}/test/jar/com.spark.excel.mysql-0.1.jar</jar>
            <spark-opts>--executor-memory 2G --num-executors 2</spark-opts>
        </spark>
        <ok to="end"/>
        <error to="fail"/>
    </action>
 <kill name="fail">
            <message>Workflow failed, error message [${wf:errorMessage(wf:lastErrorNode())}]</message>
 </kill>
  <end name="end" />
</workflow-app>
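Depending on the Oozie version, the workflow definition can be syntax-checked against the bundled schemas before submission with the validate sub-command; a rough sketch, assuming the XML file is available locally:

oozie validate workflow.xml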

I have set up the sharelib path as well. Oozie shows spark2 through the sharelib list, and I have also added the oozie-sharelib-spark.jar file to spark2. The Oozie job submits and runs, but when it tries to execute the Spark job it throws the error above.
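One way to confirm what the action can actually load is to list the contents of the spark2 sharelib from the Oozie admin CLI and refresh it after adding jars; a minimal sketch, assuming the default Oozie server URL:

oozie admin -oozie http://host:11000/oozie -shareliblist spark2
oozie admin -oozie http://host:11000/oozie -sharelibupdate

The oozie-sharelib-spark jar, which contains org.apache.oozie.action.hadoop.SparkMain, should show up in that listing.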

Upvotes: 1

Views: 1282

Answers (1)

Miguel Angel Alonso

Reputation: 46

I had the same error. In my case I had to add the following to the properties file:

oozie.use.system.libpath=true
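For comparison, the question's properties file has oozie.use.system.lib.path=true (with an extra dot), which is not a property name Oozie recognizes, so the system sharelib is never put on the launcher classpath and org.apache.oozie.action.hadoop.SparkMain cannot be found. The relevant lines after the fix, based on the question's own file, would look like:

oozie.use.system.libpath=true
oozie.action.sharelib.for.spark=spark2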

Upvotes: 0
