Reputation: 2138
I have a mixed Java/Scala project. There are Quartz jobs that are implemented in Java and use some Scala classes. These classes should all use the same SparkContext instance, so I implemented something that should be a singleton and looks like this:
import org.apache.spark.{SparkConf, SparkContext}

object SparkContextLoader {
  var hasSC = false
  var sc: Any = 0

  def getSC(workers: Int): SparkContext = {
    if (!hasSC) {
      val sparkConf = new SparkConf().setMaster("local[" + workers + "]").setAppName("SparkApp")
      sc = new SparkContext(sparkConf)
      hasSC = true
    }
    sc.asInstanceOf[SparkContext]
  }
}
Calling SparkContextLoader from two different jobs always creates a new SparkContext instance, which is not allowed.
Why doesn't the Scala object behave like a singleton?
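If the two jobs run as threads in the same JVM, one plausible explanation is the unsynchronized check-then-act in getSC: both threads can observe hasSC == false before either one sets it. A minimal Java sketch of that scenario (the TwoJobs harness and the worker count are made up for illustration):

import org.apache.spark.SparkContext;

public class TwoJobs {
    public static void main(String[] args) throws InterruptedException {
        Runnable job = () -> {
            // Scala generates a static forwarder for the object's method,
            // so it can be called like a static Java method.
            SparkContext sc = SparkContextLoader.getSC(2);
            System.out.println(System.identityHashCode(sc));
        };
        // Two threads standing in for two Quartz jobs. If both pass the
        // hasSC check before either sets it, each tries to construct its
        // own SparkContext, which Spark then rejects.
        Thread first = new Thread(job);
        Thread second = new Thread(job);
        first.start();
        second.start();
        first.join();
        second.join();
    }
}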
Upvotes: 1
Views: 1216
Reputation: 17933
Your code is overly complicated. If the "two different jobs" are different threads, then all you need is:
import org.apache.spark.{SparkConf, SparkContext}

object SparkContextLoader {
  // hardcoded here for illustration; wire in your own worker count
  val workers = 2
  val sparkConf = new SparkConf().setMaster("local[" + workers + "]").setAppName("SparkApp")
  val sc = new SparkContext(sparkConf)
}
You can then access these vals using the answer to this question.
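For illustration, a minimal sketch of how a Java Quartz job could reach those vals, via the static MODULE$ instance that Scala generates for every object (the job class itself is made up, and SparkContextLoader is assumed to be on the classpath in the same package):

import org.apache.spark.SparkContext;
import org.quartz.Job;
import org.quartz.JobExecutionContext;

public class MyJob implements Job {
    @Override
    public void execute(JobExecutionContext context) {
        // The object compiles to a class with a static MODULE$ field;
        // every call returns the same SparkContext instance because the
        // vals are initialized exactly once by JVM class initialization.
        SparkContext sc = SparkContextLoader$.MODULE$.sc();
        // ... use sc ...
    }
}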
If the "two different jobs" are different java applications then it doesn't seem that there is a way to share a singleton across both of them.
Upvotes: 2