Reputation: 113
Can we kill one of the jobs (a time-consuming one) of a running Spark application and move on to the next job?
Let's say there are 50 jobs in a Spark application and one of them is taking more time than expected (maybe it requires more memory than we have configured). Can we kill that job and move on to the next one,
and then re-run it (the action that triggers that job) later with a higher memory configuration?
If this is not possible, how do we handle these situations?
Upvotes: 2
Views: 3464
Reputation: 1712
You can kill a running job from the Spark Web UI: open the application UI, go to the Jobs tab, find the active job, click its
(kill)
link, and confirm.Upvotes: 2
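If you want to do this programmatically rather than through the UI, Spark's job-group API (`SparkContext.setJobGroup` / `cancelJobGroup`) lets you tag the action that triggers the slow job and cancel just that group while the application keeps running. The sketch below is a minimal illustration, not the answer above's method; the application name, the 60-second timeout, and the placeholder `count()` action are assumptions for the example.

    import org.apache.spark.sql.SparkSession

    object KillSlowJobSketch {
      def main(args: Array[String]): Unit = {
        val spark = SparkSession.builder()
          .appName("kill-slow-job-sketch") // hypothetical app name
          .master("local[*]")
          .getOrCreate()
        val sc = spark.sparkContext

        // Tag all jobs triggered on this thread with the group id "slow-job".
        sc.setJobGroup("slow-job", "possibly long-running action", interruptOnCancel = true)

        // Watchdog thread: cancel the group if it runs longer than 60 seconds
        // (an assumed threshold for this sketch).
        val watchdog = new Thread(() => {
          Thread.sleep(60 * 1000L)
          sc.cancelJobGroup("slow-job")
        })
        watchdog.setDaemon(true)
        watchdog.start()

        try {
          // The action that triggers the job; count() is just a placeholder.
          val n = sc.parallelize(1 to 1000000).map(x => x * 2).count()
          println(s"Job finished: $n")
        } catch {
          case e: Exception =>
            // A cancelled job surfaces as an exception; log it and move on,
            // then re-run this action later with a higher memory configuration.
            println(s"Job was cancelled or failed: ${e.getMessage}")
        } finally {
          sc.clearJobGroup()
        }

        spark.stop()
      }
    }

Wrapping each action in its own job group and a try/catch is what lets the driver skip a cancelled job and continue with the remaining ones instead of the whole application failing.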