Georg Heiler

Reputation: 17694

Playframework Spark provisioning exception on reload

Play and Spark are both awesome, but I have some trouble combining them. Play offers a nice re-compilation mechanism; however, it is not possible to re-instantiate a Spark context.

If I have errors in my code or change some code and Play re-compiles, I unfortunately receive the following error:

ProvisionException: Unable to provision, see the following errors:

1) Error injecting constructor, org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true. The currently running SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:82)
controllers.Application.createSparkContext(Application.scala:38)
controllers.Application.<init>(Application.scala:35)
controllers.Application$$FastClassByGuice$$b5b6aa19.newInstance(<generated>)
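
In essence, the controller builds the context in its constructor, roughly like this (a simplified sketch; the SparkConf settings are assumed):

import org.apache.spark.{SparkConf, SparkContext}
import play.api.mvc.Controller

class Application extends Controller {
  // Guice builds a fresh Application on every dev-mode reload, so this
  // runs again while the previous SparkContext is still alive in the JVM.
  val sparkContext = createSparkContext()

  def createSparkContext(): SparkContext =
    new SparkContext(new SparkConf().setAppName("myApplication").setMaster("local"))
}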

One workaround is to manually kill the Play application and re-run it, but that does not seem like a good solution. Any better ideas?

Upvotes: 1

Views: 170

Answers (1)

NieMaszNic

Reputation: 617

I had the same problem. Here is a solution that worked for me:

First, define a LocalSparkProvider object (LocalSparkProvider.scala):

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.SQLContext

object LocalSparkProvider {
  // One SparkContext for the whole application, created lazily
  // when the object is first referenced.
  val sparkContext = new SparkContext(new SparkConf().setAppName("myApplication").setMaster("local"))
  val sqlContext = new SQLContext(sparkContext)
}
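
Controllers can then reuse the shared context instead of constructing their own, for example (the action below is just illustrative):

import play.api.mvc._

class Application extends Controller {
  def count = Action {
    // Reuse the single shared context; never call new SparkContext here.
    val n = LocalSparkProvider.sparkContext.parallelize(1 to 100).count()
    Ok(s"count = $n")
  }
}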

Now create a Global object in Global.scala under the root package (in a Play application that is the "app" directory):

import play.api.GlobalSettings

object Global extends GlobalSettings {
  // Use the fully qualified play.api.Application type here to avoid
  // a name clash with the controllers.Application class.
  override def onStop(app: play.api.Application): Unit = {
    LocalSparkProvider.sparkContext.stop()
  }
}

When the application is reloaded, Play triggers the onStop method, which can be used to stop the Spark context.
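
Note that GlobalSettings is deprecated since Play 2.5. On newer versions the same shutdown can be registered through an injected ApplicationLifecycle stop hook (a sketch, assuming the default Guice runtime DI; the class name is arbitrary):

import javax.inject.{Inject, Singleton}
import play.api.inject.ApplicationLifecycle
import scala.concurrent.Future

@Singleton
class SparkShutdown @Inject() (lifecycle: ApplicationLifecycle) {
  // Stop hooks run whenever the application shuts down,
  // including on dev-mode reloads.
  lifecycle.addStopHook { () =>
    Future.successful(LocalSparkProvider.sparkContext.stop())
  }
}

Bind it as an eager singleton (e.g. bind(classOf[SparkShutdown]).asEagerSingleton() in a Guice module) so the hook is registered at startup.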

Upvotes: 3
