Reputation: 410
I have an Oozie workflow with the following format:
<workflow-app xmlns="uri:oozie:workflow:0.5" name="${componente}">
...
<start to="S000_Guida_rilevazioni_annuali"/>
<action name="action name 1" cred="hcat,hs2-creds">
<spark xmlns="uri:oozie:spark-action:0.2">
<job-tracker>${jobTracker}</job-tracker>
<name-node>${nameNode}</name-node>
<master>${master}</master>
<mode>cluster</mode>
<name>class 1 name</name>
<class>com.sample.project</class>
<jar>${wf_path}/jar_file.jar</jar>
<spark-opts>--queue ${queueName} --num-executors 2 --executor-cores 2 --executor-memory 2G --jars ${hiveWarehouseConnectorLib}</spark-opts>
</spark>
<ok to="action name 2" />
<error to="killJobAction"/>
</action>
...
This workflow is set up so that if any action fails, it transitions to an action called killJobAction.
What I want to implement now is to change the workflow's control flow based on the value of a variable inside the Scala class. Say a boolean variable called varWF that is either true or false. Is this possible?
Upvotes: 0
Views: 163
Reputation: 41
You can wrap your Spark job in a shell action and call spark-submit from the shell script. See: How to catch oozie spark output
Then use the captured output in an Oozie decision node: https://oozie.apache.org/docs/3.2.0-incubating/WorkflowFunctionalSpec.html#a3.1.4_Decision_Control_Node
For example:
<workflow-app name="foo-wf" xmlns="uri:oozie:workflow:0.1">
...
<decision name="mydecision">
<switch>
<case to="next-action1">
${wf:actionData('shell_action_name')['variable_name'] eq 'true'}
</case>
<default to="next-action2"/>
</switch>
</decision>
...
</workflow-app>
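A minimal sketch of the shell-action side (the action name, script name, HDFS flag path, and the varWF key are illustrative assumptions, not taken from your workflow). The key pieces are `<capture-output/>` on the shell action, and a wrapper script that prints `key=value` pairs on stdout, which `wf:actionData(...)` then exposes to the decision node:

```xml
<!-- sketch: shell action whose stdout is captured for the decision node -->
<action name="spark_shell_wrapper">
    <shell xmlns="uri:oozie:shell-action:0.2">
        <job-tracker>${jobTracker}</job-tracker>
        <name-node>${nameNode}</name-node>
        <exec>run_spark.sh</exec>
        <file>${wf_path}/run_spark.sh#run_spark.sh</file>
        <capture-output/>
    </shell>
    <ok to="mydecision"/>
    <error to="killJobAction"/>
</action>
```

And `run_spark.sh` might look like this (here the Scala job is assumed to write its boolean flag to a file on HDFS, which the script reads back and echoes; in cluster mode the driver's stdout is not visible to the script, so some shared location like HDFS is needed):

```sh
#!/bin/bash
set -e

# Run the Spark job (class, jar, and options are placeholders)
spark-submit --master yarn --deploy-mode cluster \
  --class com.sample.project "${WF_PATH}/jar_file.jar"

# Read the flag the Scala job wrote to HDFS, defaulting to false if absent
varWF=$(hdfs dfs -cat "${WF_PATH}/flags/varWF" 2>/dev/null || echo "false")

# <capture-output/> collects key=value lines printed to stdout
echo "varWF=${varWF}"
```

The decision node would then test `${wf:actionData('spark_shell_wrapper')['varWF'] eq 'true'}`, matching the pattern in the example above.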
Upvotes: 1