Afroz Shaikh

Reputation: 362

Query Cassandra table through Spark

I am trying to get values from a Cassandra 2.0.17 table through Spark 1.6.0 and Scala 2.11.7 with the following steps:

  1. Started Cassandra -- service cassandra start
  2. Started Spark -- sbin/start-all.sh
  3. Started the Spark Scala shell -- bin/spark-shell --jars spark-cassandra-connector_2.10-1.5.0-M1.jar

Then I executed these commands in the Scala shell:

import org.apache.spark.SparkContext
import org.apache.spark.SparkContext._
import org.apache.spark.SparkConf

// Stop the shell's default context so it can be replaced with one
// that knows the Cassandra host
sc.stop

val conf = new SparkConf(true).set("spark.cassandra.connection.host", "127.0.0.1")
val sc = new SparkContext("local[2]", "test", conf)

import com.datastax.spark.connector._

Everything works fine up to this point, but when I execute:

val rdd=sc.cassandraTable("tutorialspoint","emp")

it gives me the following error:

error: bad symbolic reference. A signature in CassandraTableScanRDD.class refers to term driver
in package com.datastax which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling CassandraTableScanRDD.class.
error: bad symbolic reference. A signature in CassandraTableScanRDD.class refers to term core
in value com.datastax.driver which is not available.
It may be completely missing from the current classpath, or the version on
the classpath might be incompatible with the version used when compiling CassandraTableScanRDD.class.

I have added the jars from the Cassandra lib directory to Spark and referenced them. I am using Java version 1.8.0_72.

Am I missing something?

Upvotes: 1

Views: 810

Answers (1)

anshul_cached

Reputation: 762

The connector you are using is incompatible with your Scala and Spark versions. You are using Scala 2.11.7, but spark-cassandra-connector_2.10-1.5.0-M1.jar is built for Scala 2.10 (the _2.10 suffix in the artifact name), and the 1.5.x connector line supports Spark 1.5.x, not Spark 1.6.0. Use a connector built for your Scala version and for Spark 1.6.
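As a minimal sketch of a version-matched setup, assuming the prebuilt Spark 1.6.0 binaries (which ship with Scala 2.10) and the 1.6.0 connector artifact from Maven Central; the keyspace and table names are taken from the question:

bin/spark-shell --packages com.datastax.spark:spark-cassandra-connector_2.10:1.6.0 --conf spark.cassandra.connection.host=127.0.0.1

Then, inside the shell, there is no need to stop and recreate the SparkContext, because the Cassandra host was already set at launch:

import com.datastax.spark.connector._

// Read the table as an RDD of CassandraRow and spot-check a few rows
val rdd = sc.cassandraTable("tutorialspoint", "emp")
println(rdd.count)
rdd.take(5).foreach(println)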

Upvotes: 1
