Yash

Reputation: 1090

Schema for type Any is not supported

I'm trying to create a Spark UDF to extract a Map of (key, value) pairs from a user-defined case class.

The Scala function seems to work fine, but when I try to convert it to a UDF in Spark 2.0, I run into the "Schema for type Any is not supported" error.

case class myType(c1: String, c2: Int)

def getCaseClassParams(cc: Product): Map[String, Any] = {
  cc.getClass
    .getDeclaredFields          // all field names
    .map(_.getName)
    .zip(cc.productIterator.to) // zipped with all values
    .toMap
}

But when I try to instantiate a function value as a UDF, it results in the following error:

val ccUDF = udf{(cc: Product, i: String) => getCaseClassParams(cc).get(i)}

java.lang.UnsupportedOperationException: Schema for type Any is not supported
  at org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:716)
  at org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:668)
  at org.apache.spark.sql.catalyst.ScalaReflection$.schemaFor(ScalaReflection.scala:654)
  at org.apache.spark.sql.functions$.udf(functions.scala:2841)

Upvotes: 3

Views: 8539

Answers (1)

Assaf Mendelson

Reputation: 12991

The error message says it all: you have an `Any` in the map. Spark SQL and the Dataset API do not support `Any` in the schema. Every value has to be one of the supported types: a basic type such as String or Integer, a sequence of supported types, or a map of supported types.
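One way around this (a sketch, not the only option) is to make the function return a `Map[String, String]` instead of `Map[String, Any]`, stringifying the values so Spark can derive a schema. The `stringify` step here is just for illustration; in practice you might instead model the values with a concrete supported type.

```scala
case class MyType(c1: String, c2: Int)

// Return Map[String, String] so every value has a schema Spark supports.
// Values are converted to String purely for illustration.
def getCaseClassParams(cc: Product): Map[String, String] =
  cc.getClass
    .getDeclaredFields                            // all field names
    .map(_.getName)
    .zip(cc.productIterator.toSeq.map(_.toString)) // zipped with stringified values
    .toMap

// A UDF built over concrete column types then compiles, e.g. (hypothetical usage):
// val ccUDF = udf { (c1: String, c2: Int, key: String) =>
//   getCaseClassParams(MyType(c1, c2)).get(key)
// }
```

Note also that a UDF parameter typed as `Product` has the same problem: Spark cannot derive a schema for an abstract type, so the UDF should take the concrete column values (or a struct of supported types) rather than the case class itself.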

Upvotes: 5
