Amit Kumar

Reputation: 905

How to know remotely if a Spark job is running on a cluster

I am running Spark jobs on an EC2 cluster, and I have a trigger that submits a job periodically. I do not want to submit a job if one is already running on the cluster. Is there any API that can give me this information?

Upvotes: 1

Views: 420

Answers (2)

Ruofei Shen

Reputation: 55

You can consult the Spark UI to see the status. For example, if you run locally, take a look at localhost:4040.

Upvotes: 0

maasg

Reputation: 37435

Spark, and by extension Spark Streaming, offers an operational REST API at http://<host>:4040/api/v1

Querying the status of the current application will give you the information you are looking for.

Check the documentation: https://spark.apache.org/docs/2.1.0/monitoring.html#rest-api
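A minimal sketch of how a trigger could use this, assuming the standard monitoring setup: the UI (and its REST API) on port 4040 only exists while an application is running on the driver, so an unreachable endpoint can itself be treated as "no job running". The host, port, and timeout here are assumptions to adapt to your EC2 setup.

```python
import json
from urllib.request import urlopen
from urllib.error import URLError

def is_job_running(api_base, timeout=5):
    """Return True if a Spark application is reachable at the REST API.

    The driver only serves http://<host>:4040/api/v1 while an application
    is active, so a connection failure is taken to mean no running job.
    """
    try:
        with urlopen(f"{api_base}/applications", timeout=timeout) as resp:
            apps = json.load(resp)
        # /applications returns a JSON list of application descriptors.
        return len(apps) > 0
    except URLError:
        return False

# Hypothetical driver host; replace with your EC2 driver's address.
if is_job_running("http://localhost:4040/api/v1"):
    print("A job is already running; skipping submission.")
```

Your trigger would call `is_job_running` before each `spark-submit` and skip the submission when it returns True. Note that if the application exposes more than one UI (e.g. ports 4041, 4042 for concurrent apps), you would need to probe each port.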

Upvotes: 1
