shakedzy

Reputation: 2893

Stopping all running SparkContexts?

I'm trying to test some Scala code on an IntelliJ worksheet. Even when all I write is this:

import org.apache.spark.rdd.RDD
import org.apache.spark.sql.hive.HiveContext
import org.apache.spark.sql.types.StructType
import org.apache.spark.sql.{DataFrame, Row, SQLContext}
import org.apache.spark.{SparkConf, SparkContext, SparkException}

val sc = new SparkContext()
val sqlContext = new HiveContext(sc)

I get:

WARN SparkContext: Another SparkContext is being constructed (or threw an exception in its constructor).  This may indicate an error, since only one SparkContext may be running in this JVM (see SPARK-2243). The other SparkContext was created at:
org.apache.spark.SparkContext.<init>(SparkContext.scala:123)

This is a completely fresh piece of code; I haven't run anything since my computer booted.

Any ideas how I can fix this?

Upvotes: 2

Views: 3468

Answers (1)

Tzach Zohar

Reputation: 37832

You can stop() the context at the end of the Worksheet, so that each run leaves no running context behind:

val sc = new SparkContext()
val sqlContext = new HiveContext(sc)

try {
  // your code here... 
} finally {
  sc.stop()
}
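
For reference, here is a minimal self-contained worksheet sketch of the same pattern, assuming Spark 1.x; the local master URL and app name are illustrative choices, not taken from the original post:

import org.apache.spark.sql.hive.HiveContext
import org.apache.spark.{SparkConf, SparkContext}

// Assumed configuration: run locally on all cores; the app name is arbitrary
val conf = new SparkConf().setMaster("local[*]").setAppName("worksheet")
val sc = new SparkContext(conf)
val sqlContext = new HiveContext(sc)

try {
  // your code here...
  println(sc.parallelize(1 to 10).sum())
} finally {
  // always release the context so the next worksheet run can create a fresh one
  sc.stop()
}

Wrapping the work in try/finally guarantees stop() is called even if the code in between throws, which is what prevents the "only one SparkContext may be running in this JVM" warning on the next run.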

Upvotes: 1
