Surender Raja

Reputation: 3609

Killing a Spark job using the command prompt

What is the command to kill a Spark job from the terminal? I don't want to kill a running Spark job via the Spark UI.

Upvotes: 8

Views: 20782

Answers (4)

koiralo

Reputation: 23119

If you are running on YARN, use

yarn application -kill applicationID

Get the application ID from the YARN web UI, or list running applications with yarn application -list.
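For example, a minimal sketch of checking and then killing a job by ID (the application ID below is hypothetical; yours will carry your cluster's timestamp and a sequence number):

# Hypothetical application ID taken from the web UI or from -list output
APP_ID=application_1672531200000_0042

# Optional: confirm it is the right job before killing it
yarn application -status "$APP_ID"

# Kill the job; YARN moves it to the KILLED state
yarn application -kill "$APP_ID"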

If the job is instead running on a Spark standalone cluster (submitted in cluster deploy mode), kill the driver with

./bin/spark-class org.apache.spark.deploy.Client kill <master url> <driver ID>
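For example (the master URL and driver ID here are hypothetical placeholders; the driver ID is shown in the standalone Master web UI, usually on port 8080):

# Ask the standalone Master to kill the driver for this job
./bin/spark-class org.apache.spark.deploy.Client kill \
  spark://master-host:7077 driver-20230115123456-0001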

Alternatively, you can find the spark-submit process ID with the jps command and kill that process, but this is not the suggested way.
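A minimal sketch of that last approach, assuming the driver JVM runs on the machine you are logged into (jps lists it as SparkSubmit because that is the launcher's main class):

# Find the PID of the spark-submit JVM (output format: <pid> SparkSubmit)
jps | grep SparkSubmit

# Kill it with a plain TERM (avoid -9 so shutdown hooks can run);
# 12345 is a hypothetical PID read off the jps output above
kill 12345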

Upvotes: 9

Arun Goudar

Reputation: 381

The command below lists all running jobs in the cluster. Use grep to narrow the output to the exact job, then kill it with that application ID. Simple and fast.

yarn application -appStates RUNNING -list | grep "applicationName"

yarn application -kill <applicationId>
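If you want the two steps in one line, here is a hedged sketch (it assumes the grep matches exactly one running application, that the application ID is the first column of the listing, and GNU xargs, whose -r flag skips the kill when nothing matched):

# List running apps, pick the one whose name matches, kill it by ID
yarn application -appStates RUNNING -list 2>/dev/null \
  | grep "applicationName" \
  | awk '{print $1}' \
  | xargs -r yarn application -kill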

Upvotes: 4

BadBoy777

Reputation: 155

To see the list of applications that are running:

yarn application -list

To kill an application:

yarn application -kill <application_id>

Upvotes: 9

Vignesh I

Reputation: 2221

If you are using YARN, then just use the below command in the terminal:

yarn application -kill application_id

Upvotes: 4
