hehe

Reputation: 387

type arguments [T] do not conform to method product's type parameter bounds [T <: Product]

I want to use this code to read a CSV file, but it causes a generics error. Why? I thought I had already specified the concrete type at the call site (Dataset[MovieModel]).

def readMoviesData[T](spark: SparkSession, dataPath: String): Dataset[T] = {
  import spark.implicits._
  spark.read.format("csv").schema(Encoders.product[T].schema)
    .option("header","true").load(dataPath).as[T]
}

def analysisMovies(dataPath: String): Unit = {
  val spark = SparkSession.builder().appName("analysis movies data").getOrCreate()
  val movies: Dataset[MovieModel] = readMoviesData(spark, dataPath + "/movies.csv")
  movies.createOrReplaceTempView("movies")
  spark.sql("select count(*) from movies")
}

error

Error:(10, 53) type arguments [T] do not conform to method product's type parameter bounds [T <: Product]
spark.read.format("csv").schema(Encoders.product[T].schema)

Upvotes: 6

Views: 5531

Answers (2)

Boris Azanov

Reputation: 4491

Try adding Product : TypeTag as the bound on T and use an implicit encoder for T:

import org.apache.spark.sql.{Dataset, Encoder, Encoders, SparkSession}
import scala.reflect.runtime.universe.TypeTag

def readMoviesData[T <: Product : TypeTag](spark: SparkSession, dataPath: String): Dataset[T] = {
  // Encoders.product needs T <: Product (any case class) and a TypeTag to derive the schema
  implicit val encoder: Encoder[T] = Encoders.product[T]
  spark.read.format("csv").schema(encoder.schema)
    .option("header", "true").load(dataPath).as[T]
}
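
With that signature the call site from the question compiles unchanged; a minimal usage sketch, assuming MovieModel is a case class:

// MovieModel must be a case class (and therefore a Product) for the encoder to be derived
val movies: Dataset[MovieModel] = readMoviesData[MovieModel](spark, dataPath + "/movies.csv")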

Upvotes: 7

Jörg W Mittag

Reputation: 369554

The error message says:

product expects its type parameter to be a subtype of Product, but your type parameter is unrestricted, and thus could be anything, including something that is not a subtype of Product.

So, one way of solving it would be to make sure that you restrict T to be a subtype of Product.
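
For illustration, a hypothetical MovieModel (the question does not show its definition); any case class extends Product automatically, so it satisfies a bound such as [T <: Product : TypeTag]:

// Hypothetical definition; the real fields should match the columns of movies.csv.
// Case classes extend Product, so MovieModel conforms to T <: Product.
case class MovieModel(movieId: Int, title: String, genres: String)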

Upvotes: 4
