sds

Reputation: 60014

How to check that the SparkContext has been stopped?

How do I detect whether the SparkContext has been stopped?

Upvotes: 12

Views: 13229

Answers (4)

ijoseph

Reputation: 7163

In PySpark, I just use the kludge sc._sc._jsc is None as an indicator that the context has been stopped.

Bonus: if you want to see whether a pyspark.sql.DataFrame instance, say df, is associated with a stopped SparkContext, you can check df._sc._jsc is None.
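A minimal sketch of the kludge in action (the local[1] master and app name are illustrative; note that on a bare SparkContext the Java handle sits directly at sc._jsc, while the extra ._sc hop is for wrapper objects such as DataFrames):

from pyspark import SparkContext

sc = SparkContext(master="local[1]", appName="stop-check")
print(sc._jsc is None)   # False: the context is live
sc.stop()                # PySpark's stop() clears the Java handle
print(sc._jsc is None)   # True: the context has been stopped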

Upvotes: 1

Artem Kupchinskiy

Reputation: 31

If you use Spark 1.5, this can be done via the reflection API by reading the context's private stopped flag:

import java.lang.reflect.Field;
import java.util.concurrent.atomic.AtomicBoolean;
import org.apache.spark.SparkContext;

boolean isStopped(SparkContext sc) throws NoSuchFieldException, IllegalAccessException {
    // In Spark 1.5, SparkContext keeps its state in a private AtomicBoolean field
    Field f = sc.getClass().getDeclaredField("stopped");
    f.setAccessible(true);                              // bypass the private modifier
    AtomicBoolean stopped = (AtomicBoolean) f.get(sc);
    return stopped.get();
}

Upvotes: 1

Avihoo Mamka

Reputation: 4786

This applies to the Scala/Java API (as of the time of writing):

Before Spark 1.6 there was no way to check this from the API; you could only trigger the stop:

sc.stop()

From version 1.6 onward there is a boolean method that returns true if the context is stopped or in the midst of stopping:

sc.isStopped

This applies to the PySpark API:

Thanks to @zero323's comment:

sc._jsc.sc().isStopped()

Here sc._jsc is the Java SparkContext, and .sc() returns the underlying Scala SparkContext, which exposes isStopped().
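Putting the two PySpark checks together, a minimal sketch (is_stopped is a hypothetical helper, not a Spark API; it guards against _jsc being None first, since PySpark's own stop() clears that handle before you could call .sc() on it):

from pyspark import SparkContext

def is_stopped(sc):
    # Stopped via PySpark: the Python wrapper clears its Java handle.
    if sc._jsc is None:
        return True
    # Otherwise ask the underlying Scala SparkContext (Spark 1.6+).
    return sc._jsc.sc().isStopped()

sc = SparkContext(master="local[1]", appName="stop-check")
print(is_stopped(sc))   # False while running
sc.stop()
print(is_stopped(sc))   # True after stop()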

Upvotes: 16
