Jack Sa

Reputation: 339

Spark: error not found: value sc

I have just started with Spark. I have CDH5 installed with Spark. However, when I try to use the SparkContext it gives the error below:

<console>:17: error: not found: value sc
       val distdata = sc.parallelize(data)

I have researched this and found error: not found: value sc

and tried to start the Spark context with ./spark-shell, but it gives the error "No such file or directory".

Upvotes: 11

Views: 43661

Answers (6)

mohamed mostafa

Reputation: 1

You can run this command in the spark-shell (Scala) prompt:
conf.set("spark.driver.allowMultipleContexts","true")
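For context, here is a minimal sketch of where such a conf object could come from, assuming you are building the context by hand in the shell; the app name is just a placeholder, not from the original answer:

import org.apache.spark.{SparkConf, SparkContext}

// Illustrative only: set the property on a SparkConf, then create the context from it
val conf = new SparkConf()
  .setAppName("manual-sc")
  .set("spark.driver.allowMultipleContexts", "true")
val sc = new SparkContext(conf)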

Upvotes: 0

Zvonko

Reputation: 301

There is another Stack Overflow post that answers this question by getting sc (the SparkContext) from the SparkSession. I do it this way:

import org.apache.spark.sql.SparkSession

val spark = SparkSession.builder().appName("app_name").enableHiveSupport().getOrCreate()

val sc = spark.sparkContext
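As a quick sanity check (illustrative only, reusing the data and distdata names from the question), the recovered sc can then be used just like the built-in one:

val data = Array(1, 2, 3, 4, 5)
val distdata = sc.parallelize(data)
println(distdata.count())   // should print 5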

original answer here: Retrieve SparkContext from SparkSession

Upvotes: 4

You need to run the Hadoop daemons first (run the command "start-all.sh"), then try again.

Upvotes: 0

assasinC

Reputation: 87

Starting a new terminal fixed the problem in my case.

Upvotes: 0

Ani Menon

Reputation: 28199

Add the Spark directory to your PATH; then you can use spark-shell from anywhere.

If you are using it in a spark-submit job, add import org.apache.spark.SparkContext and create the SparkContext with:

val sc = new SparkContext(conf)

where conf is already defined.
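For illustration, a minimal self-contained sketch of how this might look in a spark-submit application; the object name SimpleCountApp and the sample data are placeholders, not from the original answer:

import org.apache.spark.{SparkConf, SparkContext}

object SimpleCountApp {
  def main(args: Array[String]): Unit = {
    // conf is defined here and then passed to the SparkContext constructor
    val conf = new SparkConf().setAppName("SimpleCountApp")
    val sc = new SparkContext(conf)
    val distdata = sc.parallelize(Seq(1, 2, 3, 4, 5))
    println(s"count = ${distdata.count()}")
    sc.stop()
  }
}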

Upvotes: 3

Nhor

Reputation: 3940

You can start spark-shell either with ./spark-shell if you're in its directory, or with path/to/spark-shell if you're elsewhere.

Also, if you're running a script with spark-submit, you need to initialize sc as a SparkContext first:

import org.apache.spark.SparkContext
import org.apache.spark.SparkConf

val conf = new SparkConf().setAppName("Simple Application")
val sc = new SparkContext(conf)

Upvotes: 5
