Luke

Reputation: 760

Error when submitting multiple spark applications to standalone cluster

If I use spark-submit to submit two Spark applications, the second application always fails with an error like the one below.

Caused by: java.lang.IllegalStateException: Cannot call methods on a stopped SparkContext.
This stopped SparkContext was created at:
org.apache.spark.api.java.JavaSparkContext.<init>(JavaSparkContext.scala:58)

They all run fine if I submit them one at a time, i.e., wait for the previous application to finish before submitting the next.

What am I missing here? Our cluster uses standalone mode.

Upvotes: 0

Views: 205

Answers (3)

Luke

Reputation: 760

The issue went away after I removed the explicit SparkContext initialization from the Spark applications. The underlying mechanism remains a mystery to me.
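A minimal sketch of the change that worked for me (assuming a Spark 2.x+ Java application; the class and app names are illustrative, not from the original code):

```java
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.sql.SparkSession;

public class MyApp {
    public static void main(String[] args) {
        // Before (problematic): constructing a SparkContext explicitly.
        // SparkConf conf = new SparkConf().setAppName("my-app");
        // JavaSparkContext jsc = new JavaSparkContext(conf);

        // After: let SparkSession create and manage the underlying SparkContext.
        SparkSession spark = SparkSession.builder()
                .appName("my-app")
                .getOrCreate();

        // If the RDD API is still needed, wrap the existing context
        // instead of creating a new one.
        JavaSparkContext jsc =
                JavaSparkContext.fromSparkContext(spark.sparkContext());

        // ... job logic ...

        spark.stop(); // stop once, at the very end of the application
    }
}
```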

Upvotes: 0

Deepraj B.

Reputation: 55

Let me know what type of Spark job you are submitting. If possible, add a code snippet from your failing Spark application.

Upvotes: 0

Ketan Kumbhar

Reputation: 85

You should check a few things:

  1. Check whether you call stop() on the SparkContext anywhere in your application.
  2. For a streaming application, instead of stopping the context yourself, use:

    sparkSession.streams().awaitAnyTermination();

  3. Check all running, failed, and succeeded jobs in the Spark UI, along with their logs.
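The points above can be sketched as follows (a hedged example assuming a Structured Streaming Java application reading from a socket source; the host, port, and class name are placeholders):

```java
import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class StreamingApp {
    public static void main(String[] args) throws Exception {
        SparkSession spark = SparkSession.builder()
                .appName("streaming-app")
                .getOrCreate();

        // Example streaming source; replace with your actual source.
        Dataset<Row> lines = spark.readStream()
                .format("socket")
                .option("host", "localhost")
                .option("port", 9999)
                .load();

        lines.writeStream()
                .format("console")
                .start();

        // Block until a query terminates, rather than calling
        // sparkContext.stop() yourself; an early stop() is what
        // produces "Cannot call methods on a stopped SparkContext".
        spark.streams().awaitAnyTermination();
    }
}
```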

Upvotes: 1
