Reputation: 288
I'm running Spark jobs on YARN with spark-submit. After my Spark job fails, the job status still shows as SUCCEEDED instead of FAILED. How can I return a failed exit code from my code to YARN?
How can I report a different application status to YARN from the code?
Upvotes: 7
Views: 2835
Reputation: 186
I had the same issue and fixed it by changing the deploy mode from client to cluster. If the deploy mode is client, the final state will always be FINISHED. See https://issues.apache.org/jira/browse/SPARK-3627
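For reference, a minimal cluster-mode invocation might look like the following; the class name and jar are placeholders, not from the original answer:
spark-submit \
    --master yarn \
    --deploy-mode cluster \
    --class com.example.MySparkApp \
    my-spark-app.jar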
Upvotes: 0
Reputation: 2650
I tried sending a non-zero exit code from within the code using
System.exit(1);
which did not work as expected, and @gsamaras mentioned the same in his answer. The following workaround worked for me, though, using try/catch:
try {
  // Spark job logic goes here
} catch {
  case e: Exception =>
    log.error("Error " + e)
    // Rethrowing makes the driver exit abnormally, so YARN
    // records the application's final status as FAILED.
    throw e
}
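Put together, a minimal sketch of this pattern could look like the following. The object name, input path, and SparkSession usage (Spark 2.x API) are my own assumptions for illustration, not part of the original answer:
import org.apache.spark.sql.SparkSession

object FailFastJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("FailFastJob").getOrCreate()
    try {
      // Hypothetical job logic: count the lines in the input path.
      val count = spark.read.textFile(args(0)).count()
      println(s"Processed $count lines")
    } catch {
      case e: Exception =>
        // Let the exception propagate out of main so that, in cluster
        // mode, the ApplicationMaster reports a FAILED final status.
        throw e
    } finally {
      spark.stop()
    }
  }
}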
Upvotes: 0
Reputation: 73366
I don't think you can do that. I experienced the same behavior with spark-1.6.2, but after analyzing the failures I don't see any obvious way of sending a "bad" exit code from my application.
Upvotes: 0