saheb bhattu

Reputation: 113

How to kill a Spark job of a Spark application?

Can we kill one of the jobs (a time-consuming one) of a running Spark application and move on to the next job?

Let us say there are 50 jobs in a Spark application and one of them is taking more time than expected (maybe it requires more memory than what we have configured). Can we kill that job and move on to the next one?

We could then run that job (the action which triggers it) later with a higher memory configuration.

If this is not possible, how should these situations be handled?

Upvotes: 2

Views: 3464

Answers (1)

user7337271

Reputation: 1712

You can kill a running job by:

  • opening the Spark application UI,
  • going to the Jobs tab,
  • finding the job among the running jobs,
  • clicking its kill link and confirming.

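If clicking through the UI is not practical, the same can be done programmatically with SparkContext.setJobGroup and SparkContext.cancelJobGroup. The sketch below is only an illustration under assumed names (the "heavy" group id, the dummy actions, and the class name are made up): the expensive action runs in its own thread under a job group, so the driver can cancel just that group and carry on with the remaining jobs.

```scala
import org.apache.spark.sql.SparkSession

object KillOneJob {
  def main(args: Array[String]): Unit = {
    val spark = SparkSession.builder().appName("kill-one-job").getOrCreate()
    val sc = spark.sparkContext

    // Run the expensive action in its own thread so the driver stays free.
    val heavy = new Thread(new Runnable {
      override def run(): Unit = {
        // Every job triggered from this thread is tagged with the group id "heavy".
        sc.setJobGroup("heavy", "time-consuming action", interruptOnCancel = true)
        try {
          // Stand-in for the slow action; replace with the real computation.
          sc.parallelize(1 to 10000000).map(identity).count()
        } catch {
          case e: Exception => println(s"heavy job was cancelled: ${e.getMessage}")
        }
      }
    })
    heavy.start()

    Thread.sleep(5000)          // pretend we decided it is taking too long
    sc.cancelJobGroup("heavy")  // cancels only the jobs in that group
    heavy.join()

    // The application keeps running; the next action is unaffected.
    println(sc.parallelize(1 to 10).sum())

    spark.stop()
  }
}
```

Note that re-running the cancelled job with a higher memory configuration would generally require starting a new application (executor memory is fixed when the application launches); the cancellation only frees the current application to move on to its next job.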
Upvotes: 2
