Justin

Reputation: 745

Exception in thread "main" org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243)

I am getting an error when trying to run a Spark application with Cassandra.

Exception in thread "main" org.apache.spark.SparkException: Only one SparkContext may be running in this JVM (see SPARK-2243). 

I am using Spark version 1.2.0, and it is clear that I am only using one Spark context in my application. But whenever I add the following code for streaming, I get this error:

JavaStreamingContext activitySummaryScheduler = new JavaStreamingContext(
            sparkConf, new Duration(1000));

Upvotes: 2

Views: 4097

Answers (3)

Areeha

Reputation: 833

One way could be as follows:

    SparkConf sparkConf = new SparkConf().setAppName("Example Spark App").setMaster("local[*]");
    JavaSparkContext jssc = new JavaSparkContext(sparkConf);
    // The batch interval is in milliseconds; 1000 ms matches the question's Duration(1000)
    JavaStreamingContext jsc = new JavaStreamingContext(jssc, new Duration(1000));

Upvotes: 1

Chenna V

Reputation: 10483

Take a look at the second code snippet in the Spark Streaming programming guide.

This is how your code should look:

import org.apache.spark.streaming.Durations;
import org.apache.spark.streaming.api.java.*;

JavaSparkContext existingSparkContext = ...   // existing JavaSparkContext
// Durations.seconds(1) equals the 1000 ms batch interval from the question
JavaStreamingContext activitySummaryScheduler = new JavaStreamingContext(existingSparkContext, Durations.seconds(1));

Upvotes: 1

RussS

Reputation: 16576

You can only have one SparkContext at a time, and since a StreamingContext contains a SparkContext, you can't have a separate streaming context and Spark context in the same code. What you can do is build a StreamingContext on top of your SparkContext, so you have access to both if you really need that.

Use this constructor: JavaStreamingContext(sparkContext: JavaSparkContext, batchDuration: Duration)
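A minimal sketch of that approach (class and app names are hypothetical, assuming the Spark 1.x Java API): create the single JavaSparkContext first, then build the JavaStreamingContext on top of it rather than constructing a second context from the SparkConf.

```java
import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.streaming.Duration;
import org.apache.spark.streaming.api.java.JavaStreamingContext;

public class SharedContextExample {
    public static void main(String[] args) {
        // Only one SparkContext may exist per JVM (SPARK-2243)
        SparkConf conf = new SparkConf()
                .setAppName("SharedContextExample")
                .setMaster("local[*]");
        JavaSparkContext sc = new JavaSparkContext(conf);

        // Build the streaming context from the existing SparkContext
        // instead of creating a new one; 1000 ms batch interval
        JavaStreamingContext ssc = new JavaStreamingContext(sc, new Duration(1000));

        // Both handles are now usable: ssc for streaming jobs,
        // and sc (the same context ssc wraps) for batch work.

        // Stopping the streaming context with stopSparkContext = true
        // also shuts down the underlying SparkContext.
        ssc.stop(true, true);
    }
}
```

With this layout, batch code keeps using sc while the streaming job runs on ssc, and the "Only one SparkContext may be running" exception no longer occurs.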

Upvotes: 5
