Reputation: 3609
What is the command to kill a Spark job from the terminal? I don't want to kill a running Spark job via the Spark UI.
Upvotes: 8
Views: 20782
Reputation: 23119
If you are running on YARN, use
yarn application -kill <applicationId>
Get the application ID from the web UI, or list the running applications with yarn application -list.
If you are running in standalone cluster mode, kill the driver with
./bin/spark-class org.apache.spark.deploy.Client kill <master url> <driver ID>
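A hypothetical invocation, assuming a standalone master at spark://master-host:7077; both the master URL and the driver ID are placeholders, and the real driver ID is shown on the standalone Master web UI:
# Placeholders: substitute your own master URL and driver ID
./bin/spark-class org.apache.spark.deploy.Client kill spark://master-host:7077 driver-20200101123456-0001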
Alternatively, you can find the PID of the spark-submit process with the jps command and kill that process, but this is not the suggested way.
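A minimal sketch of that approach (jps lists the launcher JVM under its main class name, SparkSubmit; kill sends SIGTERM):
# Find the spark-submit JVM's process ID, then terminate it
jps | grep SparkSubmit
kill <pid>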
Upvotes: 9
Reputation: 381
The command below lists all running jobs in the cluster. Use grep to find the exact job, then kill it with its application ID. Simple and fast.
yarn application -appStates RUNNING -list | grep "applicationName"
yarn application -kill <applicationId>
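If you want to script this, a sketch of a one-liner, assuming the application ID is the first column of the -list output and that "applicationName" matches exactly one running job:
# Extract the matching application ID and pass it straight to -kill
yarn application -appStates RUNNING -list | grep "applicationName" | awk '{print $1}' | xargs yarn application -kill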
Upvotes: 4
Reputation: 155
To see the list of running applications:
yarn application -list
To kill one:
yarn application -kill <appid>
Upvotes: 9
Reputation: 2221
If you are using YARN, just run the command below in the terminal:
yarn application -kill <application_id>
Upvotes: 4