roh

Reputation: 133

How to set Spark application exit status?

I'm writing a Spark application and running it with the spark-submit shell script (in yarn-cluster/yarn-client mode).

As far as I can tell, the exit code of spark-submit is determined by the status of the corresponding YARN application: 0 if it finished as SUCCEEDED, 1 otherwise.

I want the option to return a different exit code, for example for a state where my application succeeded but with some errors.

Is it possible to return a different exit code from the application?

I tried using System.exit() but didn't succeed...

Thanks.

Upvotes: 7

Views: 12320

Answers (2)

Volodymyr Zubariev

Reputation: 443

If you run in cluster mode, spark-submit exits immediately, returning the submission ID as part of a JSON response, and does not wait for the application status. After that you can query the status with

 spark-submit --status [submission ID] 

If you run in local or standalone mode, you should be able to get the exit code from the spark-submit process.

Upvotes: 0

Darshan

Reputation: 2333

It is possible in client mode but not in cluster mode. There is a workaround for cluster mode.

My answer to this question should help you.
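For illustration, here is a minimal sketch of the client-mode approach. The names (MyJob, failedRecords, the exit code 2) are hypothetical and not from the original question; the point is that in yarn-client mode the driver runs inside the spark-submit JVM, so a System.exit() call made after stopping the SparkContext should become the exit status of spark-submit. In yarn-cluster mode the driver runs on the cluster, so spark-submit only reflects the final YARN status (0 or 1).

    import org.apache.spark.{SparkConf, SparkContext}

    object MyJob {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("MyJob"))

        var exitCode = 0
        try {
          // ... run the job here and decide what "succeeded with some errors"
          // means for you, e.g. count records that failed validation
          val failedRecords = 0L // hypothetical counter
          if (failedRecords > 0) exitCode = 2 // custom code: finished, but with errors
        } catch {
          case e: Exception =>
            e.printStackTrace()
            exitCode = 1
        } finally {
          // stop the SparkContext before exiting so the application shuts down cleanly
          sc.stop()
        }
        // In yarn-client mode this exit code propagates to the spark-submit process
        System.exit(exitCode)
      }
    }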

Upvotes: 0
