Reputation: 499
Based on the suggestion from here, I would like to know how to filter datetime ranges with a timezone using PySpark.
Here is what my data looks like:
ABC, 2020-06-22T19:17:16.428+0000
DEF, 2020-06-22T19:17:16.435+0000
JKL, 2020-06-22T19:17:16.468+0000
MNO, 2020-06-22T19:17:16.480+0000
XYZ, 2020-06-22T19:17:16.495+0000
In this case, I would like to extract only the records whose milliseconds fall between 400 and 450.
I tried this, but it didn't work:
import pyspark.sql.functions as func
df = df.select(func.to_date(df.UpdatedOn).alias("time"))
sf = df.filter(df.time > '2020-06-22T19:17:16.400').filter(df.time < '2020-06-22T19:17:16.451')
Upvotes: 0
Views: 1167
Reputation: 13541
When you use to_date, it truncates the entire time portion (hours, minutes, seconds, and milliseconds), so you have to use to_timestamp and compare timestamps instead.
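As a quick check of the truncation (assuming your string column is named date, as below):

from pyspark.sql.functions import to_date

# Every sample row collapses to the same date, so a sub-second
# range filter can never match. Shows a single row: 2020-06-22.
df.select(to_date('date').alias('d')).distinct().show()

With to_timestamp, the millisecond precision is preserved and the range filter works: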
from pyspark.sql.functions import to_timestamp

# Parse the string into a proper timestamp, then filter on an inclusive range.
df.withColumn('date', to_timestamp('date')) \
  .filter("date between to_timestamp('2020-06-22T19:17:16.400') and to_timestamp('2020-06-22T19:17:16.451')") \
  .show(10, False)
+---+-----------------------+
|id |date |
+---+-----------------------+
|ABC|2020-06-22 19:17:16.428|
|DEF|2020-06-22 19:17:16.435|
+---+-----------------------+
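For reference, a minimal self-contained sketch of the whole flow (assuming Spark 3.x, where to_timestamp without an explicit format can parse this ISO-8601 string including the +0000 offset; the column names id and date are just placeholders):

from pyspark.sql import SparkSession
from pyspark.sql.functions import to_timestamp

spark = SparkSession.builder.getOrCreate()

data = [("ABC", "2020-06-22T19:17:16.428+0000"),
        ("DEF", "2020-06-22T19:17:16.435+0000"),
        ("JKL", "2020-06-22T19:17:16.468+0000"),
        ("MNO", "2020-06-22T19:17:16.480+0000"),
        ("XYZ", "2020-06-22T19:17:16.495+0000")]
df = spark.createDataFrame(data, ["id", "date"])

# Parse the string (offset included) into a timestamp, then keep rows
# whose milliseconds fall inside the 400-450 window (between is inclusive).
result = (df.withColumn("date", to_timestamp("date"))
            .filter("date between to_timestamp('2020-06-22T19:17:16.400') "
                    "and to_timestamp('2020-06-22T19:17:16.451')"))
result.show(truncate=False)

Note that the two bounds carry no offset, so they are interpreted in the Spark session's timezone. Since the data is in +0000, set spark.sql.session.timeZone to UTC (or add an explicit offset to the bound strings) if your session timezone is not already UTC, so the comparison lines up with the data.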
Upvotes: 1