Shawn.X

Reputation: 1363

How can I kill just a stuck job, without killing the application, in Spark?

[Screenshot: Spark UI stages page showing a "kill" link next to stage 1244]

As shown above, I just want to kill stage 1244, not the whole application.
If I click the kill button, will it kill the whole application?
And how can I kill only the one job in the application that I want to kill?

Upvotes: 0

Views: 590

Answers (1)

pltc

Reputation: 6082

The kill button that you highlighted will kill the current job. However:

  • If this is an interactive Spark session (e.g. a Jupyter notebook, spark-shell, or pyspark), the application will stay alive.
  • If this is a non-interactive Spark session (i.e. spark-submit), the application will be killed together with the job, because the application's status is considered failed.
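Clicking the kill link in the UI just issues an HTTP request to the driver's web UI (this requires `spark.ui.killEnabled` to be true, which is the default). As a sketch of what that link looks like, here is a small helper that builds the kill URL for a given stage id. The endpoint path `/stages/stage/kill/?id=N` is an assumption based on the default Spark UI layout; inspect the actual link target in your Spark version's UI before scripting against it:

```python
from urllib.parse import urlencode

def build_stage_kill_url(driver_ui: str, stage_id: int) -> str:
    """Build the URL that the Spark UI's stage 'kill' link points to.

    Assumes the default UI layout (/stages/stage/kill/?id=N) and the
    default driver UI port 4040; verify against your Spark version.
    """
    return f"{driver_ui.rstrip('/')}/stages/stage/kill/?{urlencode({'id': stage_id})}"

# Example: the stage from the screenshot (id 1244) on the default UI port.
url = build_stage_kill_url("http://localhost:4040", 1244)
# Sending an HTTP GET to this URL triggers the same cancellation as
# clicking the kill button in the UI.
```

In an interactive session there is also a cleaner, fully supported programmatic route: tag your work with `sc.setJobGroup("my-group", "description")` before triggering an action, and later call `sc.cancelJobGroup("my-group")` to cancel just those jobs while the application keeps running.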

Upvotes: 1
