jamiet

Reputation: 12364

java.lang.InterruptedException when creating SparkSession in Scala

If I clone this gist: https://gist.github.com/jamiekt/cea2dab3ea8de91489b31045b302e011

and then issue sbt run, it fails on the line

val spark = SparkSession.builder()
                        .config(new SparkConf().setMaster("local[*]"))
                        .enableHiveSupport()
                        .getOrCreate()

with error:

java.lang.InterruptedException at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2014)

No clue why this might be happening. Anyone got a suggestion?

Scala version is 2.11.12 (see build.sbt in the gist)
Spark version is 2.3.0 (again, see build.sbt)
Java Version

$ java -version
java version "1.8.0_161"

Upvotes: 4

Views: 8045

Answers (1)

Ramesh Maharjan

Reputation: 41987

The error occurs because the SparkSession instance you created is never stopped: as soon as sbt run completes (i.e. after the successful completion of your code), the instance is removed from memory without having been closed.

So all you need is a call to

  spark.stop()

at the end of the scope where the instance is created, like this:

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

object Application extends App {
  import DataFrameExtensions_._
  val spark = SparkSession.builder()
                          .config(new SparkConf().setMaster("local[*]"))
                          .enableHiveSupport()
                          .getOrCreate()
  //import spark.implicits._
  //val df = Seq((8, "bat"),(64, "mouse"),(-27, "horse")).toDF("number", "word")
  //val groupBy = Seq("number","word")
  //val asAt = LocalDate.now()
  //val joinedDf = Seq(df.featuresGroup1(_,_), df.featuresGroup2(_,_)).map(_(groupBy, asAt)).joinDataFramesOnColumns(groupBy)
  //joinedDf.show

  spark.stop()
}
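If anything between creating the session and calling spark.stop() can throw, it is safer still to wrap the work in try/finally so the session is always stopped. A minimal sketch along the same lines (the body is a placeholder for your own code):

import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

object Application extends App {
  val spark = SparkSession.builder()
                          .config(new SparkConf().setMaster("local[*]"))
                          .enableHiveSupport()
                          .getOrCreate()
  try {
    // your DataFrame work goes here
  } finally {
    // always runs, so the SparkListenerBus thread shuts down cleanly even on failure
    spark.stop()
  }
}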

Just before the

java.lang.InterruptedException at java.util.concurrent.locks.AbstractQueuedSynchronizer$ConditionObject.reportInterruptAfterWait(AbstractQueuedSynchronizer.java:2014)

you should also see the following message:

ERROR Utils: uncaught error in thread SparkListenerBus, stopping SparkContext

which gives a clue to the cause of the error.

Upvotes: 4
