Nipponho

Reputation: 91

Spark Context is not automatically created in Scala Spark Shell

I read in a Spark book:

Driver programs access Spark through a SparkContext object, which represents a connection to a computing cluster. In the shell, a SparkContext is automatically created for you as the variable called sc. Try printing out sc to see its type

sc

When I enter sc, it gives me the error "<console>:20: error: not found: value sc". Any idea why sc is not automatically created in my Scala Spark shell?

I tried to manually create sc, and it gave me an error saying there is already a Spark context in the JVM. Please see the picture:

http://s30.photobucket.com/user/kctestingeas1/media/No%20Spark%20Context.jpg.html
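For reference, the manual attempt was along these lines (reconstructed from memory, not the exact commands; the app name and master are just placeholders):

  import org.apache.spark.{SparkConf, SparkContext}
  val conf = new SparkConf().setAppName("shell").setMaster("local[*]")
  val sc = new SparkContext(conf)  // fails: only one SparkContext may run per JVM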

I believe I am already in the Scala Spark shell, as you can see at the top of my cmd window, which shows bin\spark-shell.

Please advise. Thanks

Upvotes: 9

Views: 6200

Answers (1)

Tri Han

Reputation: 141

Hopefully you have found the answer to your question by now, because I am encountering the same issue.

In the meantime, you can use this workaround. In the Scala Spark shell, enter:

  import org.apache.spark.SparkContext
  val sc = SparkContext.getOrCreate()

You then have access to sc.
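Once sc is defined, a quick smoke test is to run a trivial local job (just an illustration; any small RDD operation works):

  sc.parallelize(1 to 5).sum()  // returns 15.0 if the context is working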

Upvotes: 14
