chinex

Reputation: 45

Convert Java Timestamp Datatype to Scala TimestampType

Is it possible to cast/convert a Java Timestamp datatype to Scala TimestampType and vice versa?

I tried doing so this way:

val t = <Java Timestamp variable>.asInstanceOf[TimestampType]

But got this error:

java.lang.ClassCastException: java.sql.Timestamp cannot be cast to org.apache.spark.sql.types.TimestampType

Upvotes: 0

Views: 3050

Answers (1)

Boris Azanov

Reputation: 4481

In Spark, org.apache.spark.sql.types.TimestampType is a subclass of the abstract class DataType. Such subclasses are just meta-information describing the types of DataFrame columns: they don't hold a value, whereas java.sql.Timestamp does. The two classes are not related in the type hierarchy, which is why you can't convert one to the other with asInstanceOf.

Here is a small example to show the difference:

When you store data in a DataFrame, Spark converts java.sql.Timestamp values into a TimestampType column by itself:

import java.sql.Timestamp
import org.apache.spark.sql.DataFrame
import spark.implicits._   // needed for .toDF on a Seq

val t = new Timestamp(System.currentTimeMillis())
val dfA: DataFrame = Seq(
  ("a", t),
  ("b", t),
  ("c", t)
).toDF("key", "time")

But if you want to read the data back and get java.sql.Timestamp values, you can do it like this:

dfA.collect().foreach {
  row =>
    println(row.getAs[Timestamp](1))
}
// prints:
2020-07-31 00:45:48.825
2020-07-31 00:45:48.825
2020-07-31 00:45:48.825

If you look at the DataFrame schema:

dfA.printSchema()
dfA.schema.fields.foreach(println)

it prints:

root
 |-- key: string (nullable = true)
 |-- time: timestamp (nullable = true)

StructField(key,StringType,true)
StructField(time,TimestampType,true)
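
As an aside, the schema is essentially the only place you would write TimestampType yourself, for example when declaring a schema by hand. Here is a minimal sketch of that, reusing spark and t from above (the column names are just for illustration):

import org.apache.spark.sql.Row
import org.apache.spark.sql.types.{StructType, StructField, StringType, TimestampType}

// TimestampType only describes the column; the row values are still java.sql.Timestamp
val schema = StructType(Seq(
  StructField("key", StringType, nullable = true),
  StructField("time", TimestampType, nullable = true)
))

val dfB = spark.createDataFrame(
  spark.sparkContext.parallelize(Seq(Row("a", t), Row("b", t))),
  schema
)
dfB.printSchema()  // same schema shape as dfA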

But if you try to cast java.sql.Timestamp to TimestampType using asInstanceOf, you will get exactly that error:

import org.apache.spark.sql.types.TimestampType

println(t.asInstanceOf[TimestampType])
/*
java.lang.ClassCastException: java.sql.Timestamp incompatible with org.apache.spark.sql.types.TimestampType
*/
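
And for the other direction the question asks about: if you just need to put an existing java.sql.Timestamp into a DataFrame column, you don't cast it either; a lit literal is enough and Spark maps it to a timestamp column on its own. A minimal sketch, reusing dfA and t from above (the column name created_at is made up):

import org.apache.spark.sql.functions.lit

// Spark infers a timestamp column from the java.sql.Timestamp literal
val dfWithTs = dfA.withColumn("created_at", lit(t))
dfWithTs.printSchema()
// root
//  |-- key: string (nullable = true)
//  |-- time: timestamp (nullable = true)
//  |-- created_at: timestamp (nullable = true)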

Upvotes: 2
