Reputation: 339
I have the following code, which seems very lengthy. Is there a simpler way to achieve the same result? What I am trying to do is get the start and end date of each week and count the records for that week. The code, starting by creating a dataframe:
new_list = [
{"inv_dt":"01/01/2020","count":1},
{"inv_dt":"02/01/2020", "count":2},
{"inv_dt":"10/01/2020", "count":5},
{"inv_dt":"11/01/2020","count":1},
{"inv_dt":"12/01/2020", "count":5},
{"inv_dt":"20/01/2020", "count":3},
{"inv_dt":"22/01/2020", "count":2},
{"inv_dt":"28/01/2020", "count":1}
]
from pyspark.sql import functions as F
from pyspark.sql import Row
df = spark.createDataFrame(Row(**x) for x in new_list)
Now I am converting the string to a date:
df = df.withColumn("inv_dt",F.to_date("inv_dt", "dd/MM/yyyy"))
df.show()
+----------+-----+
| inv_dt|count|
+----------+-----+
|2020-01-01| 1|
|2020-01-02| 2|
|2020-01-10| 5|
|2020-01-11| 1|
|2020-01-12| 5|
|2020-01-20| 3|
|2020-01-22| 2|
|2020-01-28| 1|
+----------+-----+
Getting the week of the year:
df = df.withColumn('week_of_year',F.weekofyear(df.inv_dt))
df.show()
+----------+-----+------------+
| inv_dt|count|week_of_year|
+----------+-----+------------+
|2020-01-01| 1| 1|
|2020-01-02| 2| 1|
|2020-01-10| 5| 2|
|2020-01-11| 1| 2|
|2020-01-12| 5| 2|
|2020-01-20| 3| 4|
|2020-01-22| 2| 4|
|2020-01-28| 1| 5|
+----------+-----+------------+
Using selectExpr to get the start and end of the week, concatenating them as Week_Period, then grouping by week to get the count:
df = df.withColumn('day_of_week', F.dayofweek(F.col('inv_dt')))
df = df.selectExpr('*', 'date_sub(inv_dt, day_of_week-1) as week_start')
df = df.selectExpr('*', 'date_add(inv_dt, 7-day_of_week) as week_end')
df = df.withColumn('Week_Period', F.concat(F.col('week_start'),F.lit(' - '), F.col('week_end')))
list_of_columns = ['week_of_year','Week_Period']
df = df.groupby([F.col(x) for x in list_of_columns]).agg(F.sum(F.col('count')).alias('count'))
df.sort(df.week_of_year).show()
+------------+--------------------+-----+
|week_of_year| Week_Period|count|
+------------+--------------------+-----+
| 1|2019-12-29 - 2020...| 3|
| 2|2020-01-05 - 2020...| 6|
| 2|2020-01-12 - 2020...| 5|
| 4|2020-01-19 - 2020...| 5|
| 5|2020-01-26 - 2020...| 1|
+------------+--------------------+-----+
Upvotes: 2
Views: 3279
Reputation: 19308
This code is cleaner (it starts from the dataframe that already has the week_of_year column and uses F.next_day to get the week end directly):
list_of_columns = ['week_of_year','Week_Period']
df\
.withColumn("day_of_week", F.dayofweek(F.col("inv_dt")))\
.withColumn("week_end", F.next_day(F.col("inv_dt"), 'Sat'))\
.withColumn("week_start", F.date_add(F.col("week_end"), -6))\
.withColumn('Week_Period', F.concat(F.col('week_start'),F.lit(' - '), F.col('week_end')))\
.groupby([F.col(x) for x in list_of_columns]).agg(F.sum(F.col('count')).alias('count'))\
.sort(df.week_of_year)\
.show(truncate = False)
+------------+-----------------------+-----+
|week_of_year|Week_Period |count|
+------------+-----------------------+-----+
|1 |2019-12-29 - 2020-01-04|3 |
|2 |2020-01-05 - 2020-01-11|5 |
|2 |2020-01-12 - 2020-01-18|6 |
|4 |2020-01-19 - 2020-01-25|5 |
|5 |2020-01-26 - 2020-02-01|1 |
+------------+-----------------------+-----+
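For reference, here is a minimal single-chain sketch that keeps the question's Sunday-to-Saturday weeks. Note that F.next_day(inv_dt, 'Sat') returns the first Saturday strictly after the date, so an invoice that falls on a Saturday (such as 2020-01-11) is pushed into the following week, which is why the week-2 rows differ between the two outputs above. The result name is illustrative, and df is assumed to still contain inv_dt, count and week_of_year:
from pyspark.sql import functions as F

result = (
    df
    # Sunday-start week: dayofweek() returns 1 for Sunday, so subtract (dayofweek - 1) days
    .withColumn('week_start', F.expr('date_sub(inv_dt, dayofweek(inv_dt) - 1)'))
    .withColumn('week_end', F.expr('date_add(week_start, 6)'))
    # concat_ws casts the two dates to strings and joins them with ' - '
    .withColumn('Week_Period', F.concat_ws(' - ', 'week_start', 'week_end'))
    .groupby('week_of_year', 'Week_Period')
    .agg(F.sum('count').alias('count'))
    .sort('week_of_year')
)
result.show(truncate=False)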
Upvotes: 3