Reputation: 142
A simple transformation using unix_timestamp behaves differently in Spark 2.0.2 and 2.3.x.
At first, I thought it could be a Spark environment-related issue, such as a timezone difference, but all the settings are the same.
The example below shows the described behavior.
import org.apache.spark.sql.types.TimestampType
import org.apache.spark.sql.functions.{col, unix_timestamp}
case class Dummy(mts:String, sts:String)
val testData = Seq(Dummy("2018-05-09-06.57.53.013768", "2018-05-09-06.57.53.013198"), Dummy("2018-11-21-04.30.03.804441", "2018-11-21-04.30.03.802212")).toDF
val result = testData
.withColumn("time1", unix_timestamp(col("sts"), "yyyy-MM-dd-HH.mm.ss.SSSSSS").cast(TimestampType))
.withColumn("time2", unix_timestamp(col("sts"), "yyyy-MM-dd-HH.mm.ss.SSSSSS").cast(TimestampType))
result.select($"time1", $"time2", $"sts", $"mts").show(false)
scala> spark.version
res25: String = 2.3.1.3.0.1.0-187
scala> result.select("time1", "time2", "sts", "mts").show(false)
+-----+-----+--------------------------+--------------------------+
|time1|time2|sts |mts |
+-----+-----+--------------------------+--------------------------+
|null |null |2018-05-09-06.57.53.013198|2018-05-09-06.57.53.013768|
|null |null |2018-11-21-04.30.03.802212|2018-11-21-04.30.03.804441|
+-----+-----+--------------------------+--------------------------+
scala> spark.version
res4: String = 2.0.2
scala> result.select("time1", "time2", "sts", "mts").show(false)
+---------------------+---------------------+--------------------------+--------------------------+
|time1 |time2 |sts |mts |
+---------------------+---------------------+--------------------------+--------------------------+
|2018-05-09 06:58:06.0|2018-05-09 06:58:06.0|2018-05-09-06.57.53.013198|2018-05-09-06.57.53.013768|
|2018-11-21 04:43:25.0|2018-11-21 04:43:27.0|2018-11-21-04.30.03.802212|2018-11-21-04.30.03.804441|
+---------------------+---------------------+--------------------------+--------------------------+
Is there any particular reason for this behavior?
Upvotes: 1
Views: 167
Reputation: 2804
The problem you have is related to the function unix_timestamp. It converts a string to a Unix timestamp in seconds, so anything after the seconds is ignored.
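As a quick illustration (a minimal sketch that reuses the testData from your question), unix_timestamp already drops everything below second precision before the cast:
import org.apache.spark.sql.functions.{col, unix_timestamp}

// unix_timestamp returns a Long holding whole seconds since the epoch (the exact
// value depends on the session timezone), so the fractional part of the string
// cannot survive a round trip through cast(TimestampType).
testData
  .select(col("sts"), unix_timestamp(col("sts"), "yyyy-MM-dd-HH.mm.ss").as("epoch_seconds"))
  .show(false)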
Spark 2.0.2 was quite forgiving, but it did not really handle the SSSSSS part of your pattern: judging by your 2.0.2 output, the six digits were read as milliseconds, which is why those timestamps are shifted forward by a few seconds (and by about 13 minutes in the second row). Somewhere between Spark 2.0.2 and 2.3.x the implementation became stricter, and now you get a null to call your attention to the problem.
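For reference, here is a plain-JVM sketch (no Spark) of that lenient parsing; it assumes the old code path relied on java.text.SimpleDateFormat, which is consistent with the shifted 2.0.2 output above:
import java.text.SimpleDateFormat

val fmt = new SimpleDateFormat("yyyy-MM-dd-HH.mm.ss.SSSSSS")
// SimpleDateFormat has no microsecond field: "013768" is read as 13768 milliseconds
// (about 13.8 s) and lenient parsing rolls the excess into the seconds.
println(fmt.parse("2018-05-09-06.57.53.013768"))
// prints something like "Wed May 09 06:58:06 ... 2018", matching the shifted 2.0.2 result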
How to solve it? Just remove the .SSSSSS from the pattern, and it looks like this:
val result = testData
.withColumn("time1", unix_timestamp(col("sts"), "yyyy-MM-dd-HH.mm.ss").cast(TimestampType))
.withColumn("time2", unix_timestamp(col("sts"), "yyyy-MM-dd-HH.mm.ss").cast(TimestampType))
result.select("time1", "time2", "sts", "mts").show(false)
+-------------------+-------------------+--------------------------+--------------------------+
|time1 |time2 |sts |mts |
+-------------------+-------------------+--------------------------+--------------------------+
|2018-05-09 06:57:53|2018-05-09 06:57:53|2018-05-09-06.57.53.013198|2018-05-09-06.57.53.013768|
|2018-11-21 04:30:03|2018-11-21 04:30:03|2018-11-21-04.30.03.802212|2018-11-21-04.30.03.804441|
+-------------------+-------------------+--------------------------+--------------------------+
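If the sub-second digits do matter to you, one possible workaround (a hedged sketch, not part of the fix above; it assumes the fraction is always the last dot-separated field with six digits) is to parse whole seconds and add the fraction back from the string:
import org.apache.spark.sql.functions.{col, substring_index, unix_timestamp}
import org.apache.spark.sql.types.TimestampType

// Parse whole seconds, then add the trailing digits back as a fraction of a second.
val withMicros = testData.withColumn(
  "ts",
  (unix_timestamp(col("sts"), "yyyy-MM-dd-HH.mm.ss") +
    substring_index(col("sts"), ".", -1).cast("double") / 1000000
  ).cast(TimestampType))
withMicros.select("sts", "ts").show(false)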
Upvotes: 1