Reputation: 357
I'm writing unit tests for Spark Scala code and facing this issue: when I run the unit test files separately everything works, but when I run all of the unit tests in the module using Maven, the test cases fail. How can we create a local instance of Spark, or a mock, for unit tests?

```
Cannot call methods on a stopped SparkContext. This stopped SparkContext was created at:
org.apache.spark.sql.SparkSession$Builder.getOrCreate(SparkSession.scala:947)
```

Methods I tried:
- Creating a private SparkSession in each unit test file.
- Creating a common SparkSession trait shared by all the unit test files (a rough sketch of that trait follows this list).
- Calling spark.stop() at the end of each file, and also removing it from every file.

To reproduce, make the two unit test files shown further below and execute them together; both files build their own SparkSession.
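For reference, the common trait I tried looked roughly like this. SharedSparkSession and its members are placeholder names rather than my exact code, and the afterAll override is the variant where spark.stop() is called at the end of each file:

```scala
import org.apache.spark.sql.SparkSession
import org.scalatest.{BeforeAndAfterAll, Suite}

// Sketch of the shared-session trait (names are placeholders).
trait SharedSparkSession extends BeforeAndAfterAll { this: Suite =>

  // getOrCreate() hands back the same session to every suite running in
  // the same JVM, so all test files end up sharing one SparkContext.
  lazy val spark: SparkSession = SparkSession.builder()
    .master("local[*]")
    .getOrCreate()

  // spark.stop() variant: the first suite to finish stops the shared
  // context, and any suite that runs afterwards fails with
  // "Cannot call methods on a stopped SparkContext".
  override def afterAll(): Unit = {
    spark.stop()
    super.afterAll()
  }
}
```

By default mvn test runs all suites in the same JVM, which would explain why stopping the session in one suite breaks the others.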
```scala
import org.apache.spark.SparkContext
import org.apache.spark.sql.{SQLContext, SparkSession}
import org.scalatest.flatspec.AnyFlatSpec

class test1 extends AnyFlatSpec {

  val spark: SparkSession = SparkSession.builder
    .master("local[*]")
    .getOrCreate()

  val sc: SparkContext = spark.sparkContext
  val sqlCont: SQLContext = spark.sqlContext

  "test1" should "take spark session, spark context and sql context" in {
    // do something
  }
}
```
```scala
import org.apache.spark.SparkContext
import org.apache.spark.sql.{SQLContext, SparkSession}
import org.scalatest.flatspec.AnyFlatSpec

class test2 extends AnyFlatSpec {

  val spark: SparkSession = SparkSession.builder
    .master("local[*]")
    .getOrCreate()

  val sc: SparkContext = spark.sparkContext
  val sqlCont: SQLContext = spark.sqlContext

  "test2" should "take spark session, spark context and sql context" in {
    // do something
  }
}
```
When you run these files independently, each one works fine, but when you run them together using mvn test they fail with the stopped-SparkContext error above.
Upvotes: 1
Views: 879