Rengan

Reputation: 21

error: value cassandraTable is not a member of org.apache.spark.SparkContext

I want to access a Cassandra table from Spark. Below are the versions I am using.

Below is the script:

sc.stop  // stop the SparkContext created by the shell so we can build one with Cassandra settings

import com.datastax.spark.connector._
import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

val conf = new SparkConf(true).set("spark.cassandra.connection.host", "localhost")
val sc = new SparkContext(conf)
val test_spark_rdd = sc.cassandraTable("test1", "words")

When I run the last statement, I get an error:

:32: error: value cassandraTable is not a member of org.apache.spark.SparkContext
       val test_spark_rdd = sc.cassandraTable("test1", "words")

Hints to resolve the error would be helpful.

Thanks

Upvotes: 2

Views: 3047

Answers (1)

Manoj Danane

Reputation: 31

Actually, on the shell you just need to import the respective packages; nothing extra is needed.

e.g. scala> import com.datastax.spark.connector._
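For reference, a minimal end-to-end sketch. This assumes the Spark Cassandra Connector jar is on the shell's classpath (the --packages coordinates below are an example; match the version to your Spark and Scala versions) and that keyspace test1 with table words already exists. If the import itself fails, the connector is not on the classpath at all; without it, the implicit conversion that adds cassandraTable to SparkContext never comes into scope, which produces exactly this "not a member" error.

// Launch the shell with the connector on the classpath (example coordinates; adjust versions):
//   spark-shell --packages com.datastax.spark.connector:spark-cassandra-connector_2.11:2.4.2 \
//               --conf spark.cassandra.connection.host=localhost

scala> import com.datastax.spark.connector._              // brings cassandraTable into scope
scala> val test_spark_rdd = sc.cassandraTable("test1", "words")
scala> test_spark_rdd.take(5).foreach(println)            // sanity check: print a few rows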

Upvotes: 3
