Reputation: 760
If I use spark-submit to submit two Spark applications at the same time, the second application always fails with an error like the one below.
Caused by: java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
This stopped SparkContext was created at:
org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)
Both applications run fine if I submit them one by one, i.e., wait for the previous application to finish before submitting the next one.
What am I missing here? Our cluster runs in standalone mode.
Upvotes: 0
Views: 205
Reputation: 760
The issue went away after I removed the explicit SparkContext initialization from the Spark applications. The underlying mechanism remains a mystery.
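For illustration, a minimal sketch of that fix, assuming the applications originally constructed their own `JavaSparkContext`. The class and app names here are hypothetical, not from the original post; the key point is letting `getOrCreate()` supply the context instead of instantiating one directly:

```java
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SparkSession;

public class MyApp {  // hypothetical class name, for illustration only
    public static void main(String[] args) {
        // getOrCreate() reuses an active session if one exists, instead of
        // creating a second, conflicting SparkContext in the same JVM.
        SparkSession spark = SparkSession.builder()
                .appName("my-app")  // illustrative app name
                .getOrCreate();

        // If RDD-style APIs are needed, wrap the session's context rather
        // than calling `new JavaSparkContext(conf)` yourself.
        JavaSparkContext jsc = new JavaSparkContext(spark.sparkContext());

        // ... application logic ...

        spark.stop();
    }
}
```

This requires a Spark cluster (or local master) to actually run, so it is a sketch of the pattern rather than a verified reproduction of the fix.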
Upvotes: 0
Reputation: 55
Let me know what type of Spark job you are submitting. If possible, add a code snippet of your failing Spark application.
Upvotes: 0
Reputation: 85
You should check a few things. If you are waiting on a single streaming query with awaitTermination(), you should instead use:
sparksession.streams().awaitAnyTermination();
Also check all running, failed, and succeeded jobs in the Spark UI, along with their logs.
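To show where that call fits, here is a minimal Structured Streaming sketch. The socket source, host, and port are illustrative assumptions (the original post shows no source); the relevant part is blocking on the query manager rather than on one query:

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;
import org.apache.spark.sql.streaming.StreamingQueryException;

public class StreamingApp {  // hypothetical class name, for illustration only
    public static void main(String[] args) throws StreamingQueryException {
        SparkSession spark = SparkSession.builder()
                .appName("streaming-app")  // illustrative app name
                .getOrCreate();

        // Illustrative source: a local socket stream of text lines.
        Dataset<Row> lines = spark.readStream()
                .format("socket")
                .option("host", "localhost")
                .option("port", 9999)
                .load();

        // Start the query without blocking on it individually.
        lines.writeStream().format("console").start();

        // Block until any started query terminates, instead of calling
        // awaitTermination() on a single StreamingQuery.
        spark.streams().awaitAnyTermination();
    }
}
```

As above, this needs a Spark runtime and a reachable socket source to execute, so treat it as a pattern sketch rather than a drop-in program.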
Upvotes: 1