pythonic

Reputation: 21685

Can I call the SparkContext constructor twice?

I need to do something like the following.

val conf = new SparkConf().setAppName("MyApp")
val master = new SparkContext(conf).master

if (master == "local[*]") // running locally
{
  conf.set(...)
  conf.set(...)
}
else // running on a cluster
{
  conf.set(...)
  conf.set(...)
}

val sc = new SparkContext(conf)

I first check whether I am running in local mode or cluster mode, and set the conf properties accordingly. But just to find out the master, I first have to create a SparkContext object. And after setting the conf properties, I obviously create another SparkContext object. Is this fine? Or would Spark just ignore my second constructor call? If that is the case, how else can I find out the master (i.e. whether I am in local or cluster mode) before creating the SparkContext object?

Upvotes: 0

Views: 203

Answers (1)

puhlen

Reputation: 8529

Starting multiple SparkContexts at the same time will throw an error.

You can get around this by keeping a reference to the first context and stopping it before creating the second. (Note that in your snippet, `master` is a String, so you cannot call `stop()` on it.)

val firstContext = new SparkContext(conf)
val master = firstContext.master
firstContext.stop()
val sc = new SparkContext(conf)

It's unnecessary to do this, though: you can read the master straight from the SparkConf without ever starting a SparkContext.

conf.get("spark.master")
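Putting it together, your whole snippet can be rewritten without the throwaway context. A minimal sketch (the property names set in each branch are placeholders, not recommendations; note that `conf.get("spark.master")` throws if the master was never set, e.g. when spark-submit has not injected it yet, so the two-argument overload with a default is used here):

```scala
import org.apache.spark.{SparkConf, SparkContext}

val conf = new SparkConf().setAppName("MyApp")

// Read the master directly from the conf; no SparkContext needed.
// The second argument is a fallback for when spark.master is unset.
val master = conf.get("spark.master", "local[*]")

if (master.startsWith("local")) {
  // running locally
  conf.set("spark.driver.memory", "2g")    // placeholder property
} else {
  // running on a cluster
  conf.set("spark.executor.memory", "4g")  // placeholder property
}

val sc = new SparkContext(conf)
```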

Upvotes: 2

Related Questions