Matt Indeedhat Holmes

Reputation: 725

could not find implicit value for evidence parameter of type - Spark

I'm new to Scala and Spark and could do with some help regarding the above error. Here is a snippet of my code that is causing issues:

case class Session (user_id: String, creation_date: BigInt, offline: Boolean)
case class User (user_id: String, app_id: Int, vendor_code: String, app_version: String)

val users = sc.cassandraTable[User]("leech_seed", "user").select("user_id", "app_id", "vendor_code", "app_version").where("last_active >=" + (timestamp - 86400000))
val sessions = sc.cassandraTable[Session]("leech_seed", "session").select("user_id", "creation_date", "offline").where("creation_date < " + timestamp + " AND creation_date >=" + (timestamp - 86400000))

When I use this code in the spark shell it works fine, but when I try to build a jar with sbt I get the following error: could not find implicit value for evidence parameter of type com.datastax.spark.connector.rdd.reader.RowReaderFactory[User]

This has been doing my head in for longer than I'd like to admit, so any help/insight would be greatly appreciated.

Note: I am using the DataStax Cassandra connector for Spark.

Upvotes: 0

Views: 668

Answers (1)

Gillespie

Reputation: 2228

Check that your spark-cassandra-connector version is up to date with the version of Spark you are using. I have encountered these issues using connector versions older than 2.10-1.4.0-M3 with Spark 1.4.1.

Also ensure that your case classes are defined outside of your main method - otherwise you will encounter No RowReaderFactory can be found for this type or similar.
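To illustrate, here is a minimal sketch of the layout described above, using the keyspace and columns from the question. The object name UserJob is made up for the example; the point is that the case class sits at the top level, where the connector's macro-derived RowReaderFactory implicit can be resolved at compile time:

```scala
import com.datastax.spark.connector._
import org.apache.spark.{SparkConf, SparkContext}

// Defined at the top level, NOT inside main - this is what lets the
// compiler derive the implicit RowReaderFactory[User] evidence.
case class User(user_id: String, app_id: Int, vendor_code: String, app_version: String)

object UserJob {
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf().setAppName("UserJob")
    val sc = new SparkContext(conf)

    // Compiles because User is visible at the top level.
    val users = sc.cassandraTable[User]("leech_seed", "user")
      .select("user_id", "app_id", "vendor_code", "app_version")

    users.collect().foreach(println)
    sc.stop()
  }
}
```

If the case class were moved inside main, it would become a local class and the implicit derivation would fail with the error shown in the question.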

Upvotes: 1
