Shiva Achari

Reputation: 975

sbt package error on leftJoinWithCassandraTable spark cassandra

The spark-cassandra-connector function leftJoinWithCassandraTable works fine in spark-shell, but when packaging with sbt I get the error below.

My Scala code snippet:

import com.datastax.spark.connector._  // SomeColumns, leftJoinWithCassandraTable

case class time_struct(id: String, month: Int, day: Int, platform: String,
                       type_1: String, title: String, time: Long)

// mapPartitions over time_data, converting each row to a time_struct
val rdd = time_data.mapPartitions(data =>
  data.map(row =>
    time_struct(row.getString(0), row.getInt(1), row.getInt(2), row.getString(3),
                row.getString(4), row.getString(5), row.getDouble(6).toLong)))

val join = rdd.leftJoinWithCassandraTable("ks1", "tbl1",
  SomeColumns("time"), SomeColumns("id"))

$ sbt package

[error] Test.scala:187: could not find implicit value for parameter rwf: com.datastax.spark.connector.writer.RowWriterFactory[time_struct]
[error]     val join = rdd.leftJoinWithCassandraTable("ks1", "tbl1",SomeColumns("time" ),
[error]                                              ^

build.sbt

scalaVersion := "2.11.8"

scalacOptions := Seq("-unchecked", "-deprecation", "-encoding", "utf8")

libraryDependencies ++= {
  val sparkV = "2.1.0"

  Seq(
    "org.apache.spark" %% "spark-core" % sparkV % "provided",
    "org.apache.spark" %% "spark-sql" % sparkV % "provided",

    "com.datastax.spark" % "spark-cassandra-connector_2.11" % "2.0.0-RC1",
    "com.databricks" %% "spark-csv" % "1.5.0"
)
}

libraryDependencies += "org.apache.hadoop" % "hadoop-aws" % "2.7.1"

Environment:

scala 2.11.8

spark 2.1.0

sbt 0.13.13

Upvotes: 0

Views: 127

Answers (1)

Shiva Achari

Reputation: 975

I found my mistake: I had declared the case class inside the main function. After I moved the case class definition out of the main function, it compiled successfully.
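
For reference, a minimal sketch of the corrected layout. Only the case class, keyspace, and table names come from the question; the object name JoinExample, the connection host, and the sample data are hypothetical stand-ins for the original time_data pipeline. With the case class at the top level, the compiler can resolve the implicit RowWriterFactory[time_struct] that leftJoinWithCassandraTable requires.

import com.datastax.spark.connector._
import org.apache.spark.{SparkConf, SparkContext}

// Top-level definition: the connector can now derive the implicit
// RowWriterFactory[time_struct] that leftJoinWithCassandraTable needs.
case class time_struct(id: String, month: Int, day: Int, platform: String,
                       type_1: String, title: String, time: Long)

object JoinExample {  // hypothetical object name
  def main(args: Array[String]): Unit = {
    val conf = new SparkConf()
      .setAppName("JoinExample")
      .set("spark.cassandra.connection.host", "127.0.0.1")  // assumed local Cassandra
    val sc = new SparkContext(conf)

    // Hypothetical sample data standing in for the original time_data pipeline.
    val rdd = sc.parallelize(Seq(
      time_struct("id-1", 1, 15, "web", "t1", "title", 1000L)))

    // Select the "time" column, joining on "id" against ks1.tbl1.
    val join = rdd.leftJoinWithCassandraTable("ks1", "tbl1",
      SomeColumns("time"), SomeColumns("id"))
    join.collect().foreach(println)
  }
}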

Upvotes: 1
