Abhishek Anand

Reputation: 1992

Mark Spark Job as Failed in Yarn UI

The application is part of a complex ecosystem where we track the status of all jobs using the YARN REST API.
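For context, that tracking amounts to polling the ResourceManager's REST endpoint per application. A minimal sketch of such polling, assuming a reachable ResourceManager (the host, port, and application id below are placeholders, not from the original post, and a real system would use a proper JSON library):

import scala.io.Source

// Poll the YARN ResourceManager REST API for one application's status.
// The JSON response contains "state" (e.g. FINISHED) and "finalStatus"
// (e.g. SUCCEEDED, FAILED) -- finalStatus is the field in question here.
object YarnStatusPoller {
  def main(args: Array[String]): Unit = {
    val rmAddress = "resourcemanager.example.com:8088"  // placeholder
    val appId     = "application_1234567890123_0001"    // placeholder
    val json = Source.fromURL(s"http://$rmAddress/ws/v1/cluster/apps/$appId").mkString

    // Crude field extraction, only to show which fields are tracked.
    def field(name: String): Option[String] = {
      val pattern = ("\"" + name + "\"\\s*:\\s*\"([^\"]+)\"").r
      pattern.findFirstMatchIn(json).map(_.group(1))
    }

    println(s"state=${field("state").getOrElse("?")}, finalStatus=${field("finalStatus").getOrElse("?")}")
  }
}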

Now, for a specific business scenario, we need to mark a Spark job as Failed. But I have landed in a gotcha situation: no matter what I raise in the Spark job, be it an Error/Exception or System.exit(123), the job gets marked as Finished in YARN, with finalStatus Succeeded.

We use spark-submit to launch the Spark job from a jar.

import org.apache.spark.{SparkConf, SparkContext}

object Execute {
  def main(args: Array[String]): Unit = {
    val sparkConf = new SparkConf()
    val sc = new SparkContext(sparkConf)
    val businessExceptionNeedsToBeRaised: Boolean = ???  // business condition goes here
    if (businessExceptionNeedsToBeRaised) {
      // What to do???
    }
  }
}

Things I have tried in the Spark job: throwing an Error/Exception, and calling System.exit(123), as sketched below.
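A sketch reconstructing those attempts as described above (the condition name is a placeholder; neither variant changed the finalStatus YARN reported):

import org.apache.spark.{SparkConf, SparkContext}

object ExecuteAttempts {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf())
    val businessExceptionNeedsToBeRaised = true  // placeholder condition
    if (businessExceptionNeedsToBeRaised) {
      sc.stop()
      // Attempt 1: let an exception escape main
      throw new RuntimeException("business validation failed")
      // Attempt 2, tried instead of the throw: exit with a non-zero code
      // System.exit(123)
    }
  }
}

For what it's worth, whether an uncaught exception reaches YARN's final status depends on the deploy mode: in yarn-cluster mode the driver runs inside the ApplicationMaster, so an exception escaping main normally does mark the application as Failed, whereas in yarn-client mode the driver's exit is not visible to YARN at all.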

Hopefully someone can tell me how to mark a Spark job as Failed in the YARN UI.

Upvotes: 2

Views: 1048

Answers (1)

Abhishek Anand

Reputation: 1992

Never mind this. YARN's reporting of Spark application status is unreliable anyway, as is evident from multiple bugs on JIRA indicating that YARN sometimes marks succeeded Spark jobs as Failed, and vice versa.

I ended up creating my own DB table to keep track of the resulting final status (error, success, etc.), which is updated from the Spark job depending on the condition.
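A minimal sketch of that pattern, assuming a JDBC-reachable database (the JDBC URL, credentials, and table/column names are placeholders, not from the original answer):

import java.sql.DriverManager

// Hypothetical status writer: record the job's final status in our own
// table instead of relying on YARN's finalStatus.
object JobStatusTracker {
  private val jdbcUrl = "jdbc:postgresql://db.example.com:5432/jobs"  // placeholder
  private val user    = "jobs_user"                                   // placeholder
  private val pass    = sys.env.getOrElse("JOBS_DB_PASSWORD", "")

  def recordStatus(jobId: String, status: String, detail: String): Unit = {
    val conn = DriverManager.getConnection(jdbcUrl, user, pass)
    try {
      val stmt = conn.prepareStatement(
        "INSERT INTO job_status (job_id, final_status, detail, updated_at) " +
        "VALUES (?, ?, ?, now())")
      stmt.setString(1, jobId)
      stmt.setString(2, status)  // e.g. "success" or "error"
      stmt.setString(3, detail)
      stmt.executeUpdate()
      stmt.close()
    } finally conn.close()
  }
}

The Spark job wraps its business logic in a try/catch and calls recordStatus with "success" or "error" accordingly; the tracking system then reads this table instead of trusting YARN's finalStatus.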

Upvotes: 0
