Shailaja Koppishetty

Reputation: 61

How to kill a Spark job by job name on a standalone cluster

How can I kill a Spark job by its job name on a standalone cluster? How can I list Spark job IDs on the sandbox? Is there any command similar to yarn application -list?

Upvotes: 0

Views: 262

Answers (1)

Shailaja Koppishetty

Reputation: 61

#!/bin/bash
# Kill a Spark job on a standalone cluster by application name and jar name.
app_name=$1
jar=$2

# Find the SparkSubmit process matching the application name and jar name,
# take the first match, and extract its process ID (column 2 of `ps -ef`).
ps -ef | grep -w "${app_name}" | grep -w 'org.apache.spark.deploy.SparkSubmit' | grep -w "${jar}" \
  | sed -n 1p | awk '{print $2}' > kill.txt

while read -r pid; do
  kill -9 "$pid"
  echo "Process ID $pid for application ${app_name} killed"
done < kill.txt
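
For reference, assuming the script above is saved as kill_spark_job.sh (an illustrative file name, not part of the original answer), it takes the application name and the jar name as its two arguments. The application and jar names below are placeholders:

chmod +x kill_spark_job.sh
# kill the Spark application named "MyStreamingApp" that was submitted from my-app.jar
./kill_spark_job.sh MyStreamingApp my-app.jar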

Upvotes: 1
