shanlodh

Reputation: 1045

How to have multiple StreamingContexts in a single Spark application?

I'm trying to run a Spark Streaming job in the spark-shell on localhost. Following the code from here, this is what I first tried:

import org.apache.spark._
import org.apache.spark.streaming._
import org.apache.spark.streaming.StreamingContext._ 

val conf = new SparkConf().setMaster("local[2]").setAppName("NetworkWordCount")
val ssc = new StreamingContext(conf, Seconds(30))

Which gives the following error:

org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). To ignore this error, set spark.driver.allowMultipleContexts = true.

And so I had to try this:

import org.apache.spark._
import org.apache.spark.streaming._
import org.apache.spark.streaming.StreamingContext._ 

val conf = new SparkConf().setMaster("local[2]").setAppName("NetworkWordCount")
  .set("spark.driver.allowMultipleContexts", "true")
val ssc = new StreamingContext(conf, Seconds(30))

This runs but with the following warning:

2018-05-17 17:01:14 WARN SparkContext:87 - Multiple running SparkContexts detected in the same JVM!

So I'd like to know if there is another way of declaring a StreamingContext object that does not require allowMultipleContexts = true, since using multiple contexts appears to be discouraged. Thanks

Upvotes: 1

Views: 369

Answers (1)

philantrovert

Reputation: 10082

You need to use the existing SparkContext sc (which the spark-shell creates for you on startup) to create the StreamingContext:

val ssc = new StreamingContext(sc, Seconds(30))

When you create it using the alternate constructor, i.e. the one that takes a SparkConf, it internally creates another SparkContext, which is why you get that warning.
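
For reference, here is a minimal sketch of the NetworkWordCount example run from the spark-shell, reusing the shell's pre-built sc. The source host and port for socketTextStream are assumptions for local testing (e.g. a socket opened with nc -lk 9999):

import org.apache.spark.streaming.{Seconds, StreamingContext}

// Reuse the shell's SparkContext instead of building a new SparkConf,
// so no second SparkContext is ever created
val ssc = new StreamingContext(sc, Seconds(30))

// Assumed input source: a plain text socket on localhost:9999
val lines = ssc.socketTextStream("localhost", 9999)

// Standard word count over each 30-second batch
val counts = lines.flatMap(_.split(" ")).map(word => (word, 1)).reduceByKey(_ + _)
counts.print()

ssc.start()             // begin receiving and processing data
ssc.awaitTermination()  // block until the streaming job is stopped

Since sc already holds the application's master and app name, there is no SparkConf to build and nothing for allowMultipleContexts to work around.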

Upvotes: 1
