
Reputation: 2083

could not find implicit value for evidence parameter of type org.apache.spark.sql.Encoder[String]

I am trying to load a dataframe into a Hive table.

import org.apache.spark.sql.SparkSession
import org.apache.spark.sql.SaveMode
import org.apache.spark.sql._

object SparkToHive {
  def main(args: Array[String]) {
    val warehouseLocation = "file:${system:user.dir}/spark-warehouse"
    val sparkSession = SparkSession.builder.master("local[2]").appName("Saving data into HiveTable using Spark")
                        .enableHiveSupport()
                        .config("hive.exec.dynamic.partition", "true")
                        .config("hive.exec.dynamic.partition.mode", "nonstrict")
                        .config("hive.metastore.warehouse.dir", "/user/hive/warehouse")
                        .config("spark.sql.warehouse.dir", warehouseLocation)
                        .getOrCreate()
    **import sparkSession.implicits._**
    val partfile = sparkSession.read.text("partfile").as[String]

    val partdata = partfile.map(part => part.split(","))
    case class Partclass(id:Int, name:String, salary:Int, dept:String, location:String)
    val partRDD  = partdata.map(line => PartClass(line(0).toInt, line(1), line(2).toInt, line(3), line(4)))
    val partDF   = partRDD.toDF()
    partDF.write.mode(SaveMode.Append).insertInto("parttab")
  }
}

I haven't executed it yet, but I am getting the following error at this line:

import sparkSession.implicits._
could not find implicit value for evidence parameter of type org.apache.spark.sql.Encoder[String]

How can I fix this?

Upvotes: 2

Views: 13628

Answers (2)


Reputation: 2083

The mistakes I made were:

  1. The case class should be outside main and inside the object.

  2. In the line val partfile = sparkSession.read.text("partfile").as[String], I used read.text("..."), which returns a DataFrame; read.textFile("...") returns a Dataset[String] directly.
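The two fixes above can be sketched in plain Scala (Spark calls are omitted so the snippet compiles standalone; the comma-split parsing and the Partclass fields are taken from the question, while parseLine is a hypothetical helper added for illustration):

```scala
object SparkToHive {
  // Fix 1: the case class lives at the object level, outside main,
  // so Spark can derive an Encoder for it via sparkSession.implicits._
  case class Partclass(id: Int, name: String, salary: Int, dept: String, location: String)

  // The same comma-split parsing the question applies per line,
  // factored into a plain function (hypothetical helper, not in the original).
  def parseLine(line: String): Partclass = {
    val fields = line.split(",")
    Partclass(fields(0).toInt, fields(1), fields(2).toInt, fields(3), fields(4))
  }

  def main(args: Array[String]): Unit = {
    // Fix 2: with Spark, sparkSession.read.textFile("partfile") would yield
    // a Dataset[String] directly, with no need for .as[String].
    println(parseLine("1,alice,1000,hr,ny"))
  }
}
```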

Upvotes: 1

Ramesh Maharjan

Reputation: 41957

Please move your case class Partclass outside of the SparkToHive object; it should be fine then.

And there are ** in your implicits import statement. Try

import sparkSession.sqlContext.implicits._

Upvotes: 16
