Chankin

Reputation: 137

Using SparkContext outside main

I'm sorry if this is an elementary question, but I could not find an answer searching for it (or I'm searching for the wrong thing).

I have two files in a program: main.scala and second.scala

import org.apache.spark.{SparkConf, SparkContext}

object main {
  def main(args: Array[String]) = {
    /* load spark conf */
    val sparkConf = new SparkConf().setAppName("main")
    val sc = new SparkContext(sparkConf)
  }
}

and in a separate file:

object second {
  val somelist = List(1, 2, 3)
  sc.parallelize(somelist) // does not compile: sc is not in scope here
}

I want to create an RDD in the second file, but I can't call sc because it is out of scope (no matter where I put sc or what imports I use).

How do I fix that?

Upvotes: 1

Views: 1153

Answers (1)

rogue-one

Reputation: 11587

sc is a local variable that exists only inside the main method. To use sc elsewhere, you will have to pass the context object as a parameter to a method, as shown below.

import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD

object Second {
  // receives the SparkContext created in main
  def createRDD(sc: SparkContext): RDD[Int] = {
    val somelist = List(1, 2, 3)
    sc.parallelize(somelist)
  }
}

Then call the createRDD method from main:

import org.apache.spark.{SparkConf, SparkContext}

object main {
  def main(args: Array[String]) = {
    /* load spark conf */
    val sparkConf = new SparkConf().setAppName("main")
    val sc = new SparkContext(sparkConf)
    Second.createRDD(sc)
  }
}
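
As a side note: if your Spark version provides SparkContext.getOrCreate(), you can fetch the context that main already started instead of threading sc through every method. A minimal sketch under that assumption (this variant of createRDD takes no parameter):

import org.apache.spark.SparkContext
import org.apache.spark.rdd.RDD

object Second {
  def createRDD(): RDD[Int] = {
    // getOrCreate() returns the SparkContext already started in main
    // (or creates one from the default conf if none is running)
    val sc = SparkContext.getOrCreate()
    val somelist = List(1, 2, 3)
    sc.parallelize(somelist)
  }
}

Passing sc explicitly, as above, is still the clearer approach in most codebases, since it makes the dependency visible in the method signature.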

Upvotes: 1
