Reputation: 1330
I have the below dataframe with a column:
df
id dt
1 2016/2017 Q2
2 2017/2018 Q1
3 2018/2019 Q2
output:
df
id date
1 2016-07-01
2 2017-04-01
3 2018-07-01
I need to convert them into dates in PySpark. Usually I use the below code to convert to a date by specifying a format, but I couldn't find any format for quarters. Could you please advise?
code: F.from_unixtime(F.unix_timestamp(date_str, fmt)).cast("date")
Upvotes: 0
Views: 1353
Reputation: 31490
I don't think there is a direct function/format that returns a date from a quarter string. We need to use a when expression (or a udf) for this case.
Example:
from pyspark.sql.functions import when, lower, reverse, split, col, concat_ws, substring, lit

df = spark.createDataFrame([("1","2016/2017 Q2"),("2","2017/2018 Q1"),("3","2018/2019 Q3"),("4","2019/2020 Q4")],["id","dt"])

# 4 quarters in a year; grab the trailing "Qn" token and map it to the
# first month of that calendar quarter, keeping the first year of the pair
qtr = lower(reverse(split(col("dt"), " "))[0])
df.withColumn("date",
    when(qtr == "q1", concat_ws("-", substring(col("dt"), 1, 4), lit("01-01")).cast("date")).
    when(qtr == "q2", concat_ws("-", substring(col("dt"), 1, 4), lit("04-01")).cast("date")).
    when(qtr == "q3", concat_ws("-", substring(col("dt"), 1, 4), lit("07-01")).cast("date")).
    when(qtr == "q4", concat_ws("-", substring(col("dt"), 1, 4), lit("10-01")).cast("date")).
    otherwise(lit(None))).show()  # None (not a string) keeps the column a DateType
#+---+------------+----------+
#| id| dt| date|
#+---+------------+----------+
#| 1|2016/2017 Q2|2016-04-01|
#| 2|2017/2018 Q1|2017-01-01|
#| 3|2018/2019 Q3|2018-07-01|
#| 4|2019/2020 Q4|2019-10-01|
#+---+------------+----------+
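Note that the asker's expected output (Q1 → 04-01, Q2 → 07-01) suggests fiscal quarters starting in April, whereas the code above maps calendar quarters. A minimal pure-Python sketch under that fiscal-year assumption (the udf wrapper in the trailing comment is an assumption, not part of the original answer):

```python
from datetime import date

def fiscal_quarter_start(dt_str):
    """Map a 'YYYY/YYYY Qn' fiscal label to the quarter's start date,
    assuming the fiscal year begins in April: Q1 -> April of the first
    year, and Q4 -> January of the second year."""
    years, q = dt_str.split()
    y1, y2 = (int(y) for y in years.split("/"))
    n = int(q[1])                  # quarter number 1..4
    if n == 4:
        return date(y2, 1, 1)      # Q4 falls in the second year
    return date(y1, 3 * n + 1, 1)  # Q1 -> 04, Q2 -> 07, Q3 -> 10

# In Spark this could be applied through a udf, e.g.:
# from pyspark.sql.functions import udf, col
# from pyspark.sql.types import DateType
# df.withColumn("date", udf(fiscal_quarter_start, DateType())(col("dt")))
```

This reproduces the asker's sample output (2016/2017 Q2 → 2016-07-01, 2017/2018 Q1 → 2017-04-01).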
Upvotes: 2