MapReddy Usthili

Reputation: 288

Spark on YARN exit code not updating to FAILED in web UI - spark-submit

I'm running Spark jobs on YARN with spark-submit. After my Spark job fails, the job still shows its status as SUCCEEDED instead of FAILED. How can I return a failed exit code from my code to YARN?

How can we report a different application status to YARN from the code?

Upvotes: 7

Views: 2835

Answers (3)

user1020455

Reputation: 186

I had the same issue and fixed it by changing the deploy mode from client to cluster. If the deploy mode is client, the final status will always be FINISHED. Refer to https://issues.apache.org/jira/browse/SPARK-3627
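For reference, a minimal cluster-mode submission might look like the sketch below; the main class and jar name are placeholders. In cluster mode the driver runs inside YARN, so a driver failure is reported back as a FAILED application status.

# --deploy-mode cluster runs the driver in the YARN cluster,
# so an abnormal driver exit is reflected in the application status
spark-submit \
  --master yarn \
  --deploy-mode cluster \
  --class com.example.MyJob \
  my-job.jar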

Upvotes: 0

K S Nidhin

Reputation: 2650

I tried sending a non-zero exit code within the code using

System.exit(1);

which did not work as expected; @gsamaras mentioned the same in his answer. The following workaround worked for me though, using try/catch:

try {
  // Spark job logic goes here
} catch {
  case e: Exception =>
    log.error("Error " + e)
    throw e  // rethrow so the driver exits abnormally and YARN marks the application FAILED
}
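A fuller sketch of how this pattern can sit in a driver's main method, assuming cluster deploy mode, Spark 2.x's SparkSession, and a hypothetical runJob helper; the rethrown exception makes the driver exit abnormally, which YARN then reports as FAILED:

import org.apache.spark.sql.SparkSession
import org.slf4j.LoggerFactory

object MyJob {  // hypothetical driver object
  private val log = LoggerFactory.getLogger(getClass)

  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("my-job").getOrCreate()
    try {
      runJob(spark)  // hypothetical job logic
    } catch {
      case e: Exception =>
        log.error("Job failed", e)
        throw e  // abnormal driver exit -> YARN marks the application FAILED
    } finally {
      spark.stop()
    }
  }

  private def runJob(spark: SparkSession): Unit = {
    // placeholder: real transformations and actions would go here
  }
}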

Upvotes: 0

gsamaras

Reputation: 73366

I do not think you can do that. I have experienced the same behavior, but after analyzing the failures I don't see any obvious way of sending a "bad" exit code from my application.

Upvotes: 0
