Vishal

Reputation: 1492

Stop Spark batch Job Gracefully

I am working on a Spark job that updates its status in a table about its execution. If the table already contains a successful execution, I need to stop the Spark job gracefully.

I tried doing

System.exit(0)

But the job fails with the error:

Shutdown hook called before final status was reported.

What is the correct procedure to exit a Spark job gracefully?

Upvotes: 0

Views: 1769

Answers (1)

DNA

Reputation: 42586

You just need to call sc.stop() (on the SparkContext) before exiting your application.
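A minimal sketch of that pattern, assuming a plain SparkContext-based job; checkStatusTable is a hypothetical stand-in for your own lookup against the execution-status table:

    import org.apache.spark.{SparkConf, SparkContext}

    object StatusCheckJob {
      def main(args: Array[String]): Unit = {
        val sc = new SparkContext(new SparkConf().setAppName("StatusCheckJob"))

        // Hypothetical check; stands in for reading the execution-status table.
        val alreadySucceeded = checkStatusTable(sc)

        if (alreadySucceeded) {
          // Stop the SparkContext first so Spark can report its final status,
          // then return normally instead of calling System.exit(0).
          sc.stop()
          return
        }

        // ... rest of the job ...

        sc.stop()
      }

      // Placeholder for the questioner's own status-table lookup.
      def checkStatusTable(sc: SparkContext): Boolean = false
    }

The key point is that the application exits by returning from main after sc.stop(), rather than killing the JVM while the context is still running.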

See also a similar question on PySpark.

Upvotes: 1
