Gkan

Reputation: 385

How to stop running Spark application?

I wrote a few Spark jobs in Java, then submitted the jars with the submit script.

bin/spark-submit --class "com.company.spark.jobName.SparkMain" --master local[*] /tmp/spark-job-1.0.jar

There will be a service running on the same server. The service should stop the job when it receives a stop command.

I have this information about the job in the service:

Is there any way to stop a running Spark job from Java code?

Upvotes: 2

Views: 1179

Answers (1)

Justin Pihony

Reputation: 67065

Have you reviewed the REST server and the ability to use /submissions/kill/[submissionId]? That seems like it would work for your needs.
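A minimal sketch in Java of how the service could call that endpoint. This assumes the job was submitted through the standalone master's REST submission server (default port 6066) and that you captured the submissionId from the submit response; the class name, host, and submissionId here are placeholders, not real values from your setup.

```java
import java.io.IOException;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.Scanner;

public class SparkJobKiller {

    // Default port of the standalone master's REST submission server (an assumption;
    // check spark.master.rest.port in your deployment).
    static final int REST_PORT = 6066;

    // Builds the kill endpoint URL; host and submissionId are placeholders
    // you would fill in from your own environment.
    static String killUrl(String host, int port, String submissionId) {
        return "http://" + host + ":" + port + "/v1/submissions/kill/" + submissionId;
    }

    // Sends an empty POST to the kill endpoint and returns the server's JSON response.
    static String kill(String host, int port, String submissionId) throws IOException {
        HttpURLConnection conn =
                (HttpURLConnection) new URL(killUrl(host, port, submissionId)).openConnection();
        conn.setRequestMethod("POST");
        conn.setDoOutput(true);
        conn.getOutputStream().close(); // empty request body
        try (Scanner s = new Scanner(conn.getInputStream(), StandardCharsets.UTF_8.name())) {
            return s.useDelimiter("\\A").hasNext() ? s.next() : "";
        }
    }

    public static void main(String[] args) throws IOException {
        if (args.length < 2) {
            System.out.println("usage: SparkJobKiller <master-host> <submissionId>");
            return;
        }
        System.out.println(kill(args[0], REST_PORT, args[1]));
    }
}
```

Note that this only applies to submissions that went through the REST server (e.g. `--master spark://host:6066` in cluster mode); a job launched with `--master local[*]`, as in the question, has no REST endpoint to kill it through.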

Upvotes: 2
