Karan Gehlod

Reputation: 357

Spark Session Dispose after Unit test for specified file is Done

I'm writing unit tests for Spark Scala code and facing this issue: when I run the unit test files separately, everything passes, but when I run all of the unit tests in the module using Maven, the test cases fail with the error below. How can we create a local instance of Spark, or a mock, for unit tests?

    Cannot call methods on a stopped SparkContext. This stopped SparkContext was created at:

    org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:947)

Methods I tried:

  1. Creating a private Spark session in each unit test file.

  2. Creating a common Spark session trait shared by all unit test files.

  3. Calling spark.stop() at the end of each file, and also removing it from every file. To reproduce: make two unit test files and execute them together, where both files create a Spark session, as shown below.

    import org.apache.spark.SparkContext
    import org.apache.spark.sql.{SQLContext, SparkSession}
    import org.scalatest.flatspec.AnyFlatSpec

    class Test1 extends AnyFlatSpec {
      val spark: SparkSession = SparkSession.builder
        .master("local[*]")
        .getOrCreate()
      val sc: SparkContext = spark.sparkContext
      val sqlCont: SQLContext = spark.sqlContext

      "test1" should "take spark session, spark context and sql context" in {
        // do something
      }
    }

    class Test2 extends AnyFlatSpec {
      val spark: SparkSession = SparkSession.builder
        .master("local[*]")
        .getOrCreate()
      val sc: SparkContext = spark.sparkContext
      val sqlCont: SQLContext = spark.sqlContext

      "test2" should "take spark session, spark context and sql context" in {
        // do something
      }
    }
    

When you run these independently, each file works fine, but when you run them together using mvn test they fail.

Upvotes: 1

Views: 879

Answers (1)

Karan Gehlod

Reputation: 357

I found an article that solves the problem: Spark Unit Tests
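For reference, the usual fix is to share a single SparkSession across all test suites and never stop it per-suite: getOrCreate() hands every test class in the same JVM the same underlying SparkContext, so one suite calling spark.stop() breaks every suite that runs after it under mvn test. Below is a minimal sketch of such a shared trait; the names SharedSparkSession and Test1Spec are illustrative, not taken from the article.

    import org.apache.spark.sql.SparkSession
    import org.scalatest.flatspec.AnyFlatSpec

    // Shared trait: all suites reuse one lazily created session,
    // and no suite calls spark.stop(), so later suites in the same
    // JVM never see a stopped SparkContext.
    trait SharedSparkSession {
      lazy val spark: SparkSession = SparkSession.builder
        .master("local[*]")
        .appName("unit-tests")
        .getOrCreate()
    }

    class Test1Spec extends AnyFlatSpec with SharedSparkSession {
      "test1" should "use the shared session" in {
        import spark.implicits._
        // Build a tiny DataFrame to prove the session works.
        val df = Seq(1, 2, 3).toDF("n")
        assert(df.count() == 3)
      }
    }

If the session really must be stopped, stop it once for the whole run (for example from a JVM shutdown hook) rather than at the end of each test file.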

Upvotes: 1
