Reputation: 171
@Override
public Option&lt;DataType&gt; getCatalystType(int sqlType, String typeName, int size, MetadataBuilder md) {
    switch (sqlType) {
        case java.sql.Types.JAVA_OBJECT:
            switch (typeName) {
                case "map(varchar(2147483647),varchar(2147483647))":
                    return Option.apply(DataTypes.createMapType(new StringType(), new StringType()));
            }
            break;
    }
    return super.getCatalystType(sqlType, typeName, size, md);
}
This code is meant to support a complex data type when the JDBC driver reports JAVA_OBJECT. I have written the same logic in Scala, where it works fine, but when I use the Java code above it throws:
Exception in thread "main" scala.MatchError: org.apache.spark.sql.types.StringType@582b14e2 (of class org.apache.spark.sql.types.StringType).
Scala code for reference:
override def getCatalystType(sqlType: Int, typeName: String, size: Int, md: MetadataBuilder): Option[DataType] = sqlType match {
  case java.sql.Types.JAVA_OBJECT =>
    typeName match {
      case "map(varchar(2147483647),varchar(2147483647))" => Option(DataTypes.createMapType(StringType, StringType))
      case "BIT" => Option(BooleanType)
      case _ => None
    }
  case _ => None
}
Upvotes: 4
Views: 1229
Reputation: 11900
Use the singleton DataTypes.StringType
instead of instantiating a new StringType, as the Spark documentation recommends:
...
...(DataTypes.createMapType(DataTypes.StringType, DataTypes.StringType))
The data type representing String values. Please use the singleton DataTypes.StringType.
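The reason the Java version fails is that Spark internally pattern-matches against the case object StringType, and `new StringType()` from Java creates a distinct instance that is not equal to that singleton, so the match falls through with a MatchError. A minimal, self-contained Scala sketch (using hypothetical `StrType` types, not Spark's actual classes) reproduces the behavior:

```scala
// Hypothetical stand-ins mirroring Spark's pattern:
// a class that Java code can instantiate, plus a companion case object singleton.
sealed trait DataTypeLike
class StrType extends DataTypeLike
case object StrType extends StrType

// Pattern matching on the stable identifier StrType matches ONLY the singleton,
// because a case object's equality is reference equality.
def name(t: DataTypeLike): String = t match {
  case StrType => "string"
}
```

Here `name(StrType)` matches the singleton and returns "string", while `name(new StrType)` throws scala.MatchError, which is exactly what happens inside Spark when the Java code supplies `new StringType()` instead of DataTypes.StringType.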
Upvotes: 5