sp_user123

Reputation: 502

Spark dynamic window calculation

Below is the sales data available to calculate max_price. The logic for max_price is:

max(price of the last 3 weeks)

For the first 3 weeks, where the previous weeks' data is not available, max_price will be

max(week 1, week 2, week 3)

i.e. in the example below, max(rank 5, 6, 7).

How can I implement this using a window function in Spark?

(image: sample sales data)
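
As a minimal plain-Python sketch of the rule (using the example prices from the answers below, ordered oldest to newest, i.e. ranks 7..1):

prices = [3, 2, 1, 20, 21, 18, 20]          # oldest -> newest
max_price = [max(prices[:3]) if i < 3 else  # first 3 weeks: max of weeks 1-3
             max(prices[i - 3:i])           # otherwise: max of the previous 3 weeks
             for i in range(len(prices))]
print(max_price)                            # [3, 3, 3, 3, 20, 21, 21]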

Upvotes: 0

Views: 1547

Answers (2)

Ranga Vure

Reputation: 1932

Here is a solution using a PySpark Window with lead and a udf.

Please note that I changed the prices of ranks 5, 6, 7 to 1, 2, 3 to differentiate them from the other values and show that this logic picks exactly what you described.

from pyspark.sql.functions import udf, col, array, coalesce, lead
from pyspark.sql.types import IntegerType
from pyspark.sql.window import Window

# UDF that returns the maximum of the collected list of prices
max_price_udf = udf(lambda prices_list: max(prices_list), IntegerType())

df = spark.createDataFrame([(1, 5, 2019, 1, 20), (2, 4, 2019, 2, 18),
                            (3, 3, 2019, 3, 21), (4, 2, 2019, 4, 20),
                            (5, 1, 2019, 5, 1), (6, 52, 2018, 6, 2),
                            (7, 51, 2018, 7, 3)], ["product_id", "week", "year", "rank", "price"])

# order rows from the most recent week to the oldest
window = Window.orderBy(col("year").desc(), col("week").desc())

# collect the prices of the 3 earlier weeks (lead over the descending window);
# when fewer than 3 earlier weeks exist, fall back to lead with a zero/negative
# offset, which points at the current or newer rows (i.e. the first 3 weeks)
df = df.withColumn("prices_list", array([coalesce(lead(col("price"), x, None).over(window),
                                                  lead(col("price"), x - 3, None).over(window))
                                         for x in range(1, 4)]))
df = df.withColumn("max_price", max_price_udf(col("prices_list")))

df.show()

which results in:

+----------+----+----+----+-----+------------+---------+
|product_id|week|year|rank|price| prices_list|max_price|
+----------+----+----+----+-----+------------+---------+
|         1|   5|2019|   1|   20|[18, 21, 20]|       21|
|         2|   4|2019|   2|   18| [21, 20, 1]|       21|
|         3|   3|2019|   3|   21|  [20, 1, 2]|       20|
|         4|   2|2019|   4|   20|   [1, 2, 3]|        3|
|         5|   1|2019|   5|    1|   [2, 3, 1]|        3|
|         6|  52|2018|   6|    2|   [3, 1, 2]|        3|
|         7|  51|2018|   7|    3|   [1, 2, 3]|        3|
+----------+----+----+----+-----+------------+---------+
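
As a side note, the Python UDF is not strictly needed here: the built-in greatest can be applied directly to the same lead/coalesce expressions, mirroring the Scala version below (a minimal sketch):

from pyspark.sql.functions import greatest, coalesce, lead, col
from pyspark.sql.window import Window

window = Window.orderBy(col("year").desc(), col("week").desc())

# greatest() picks the row-wise maximum, so no intermediate array or UDF is required
df = df.withColumn("max_price", greatest(*[coalesce(lead(col("price"), x).over(window),
                                                    lead(col("price"), x - 3).over(window))
                                           for x in range(1, 4)]))

Avoiding the Python UDF keeps the whole computation in the JVM, which is usually faster.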

Here is the solution in Scala

import org.apache.spark.sql.expressions.Window
import org.apache.spark.sql.functions._
import spark.implicits._

var df = Seq((1, 5, 2019, 1, 20), (2, 4, 2019, 2, 18),
             (3, 3, 2019, 3, 21), (4, 2, 2019, 4, 20),
             (5, 1, 2019, 5, 1), (6, 52, 2018, 6, 2),
             (7, 51, 2018, 7, 3)).toDF("product_id", "week", "year", "rank", "price")

val window = Window.orderBy($"year".desc, $"week".desc)

// greatest() takes the row-wise maximum of the 3 earlier weeks' prices; the coalesce
// falls back to zero/negative lead offsets (the first 3 weeks) when fewer than 3 earlier weeks exist
df = df.withColumn("max_price",
  greatest((for (x <- 1 to 3) yield coalesce(lead(col("price"), x, null).over(window),
                                             lead(col("price"), x - 3, null).over(window))): _*))

df.show()

Upvotes: 1

stack0114106

Reputation: 8711

You can use SQL window functions combined with greatest(). When the window contains fewer than 3 following rows, you have to fall back on the current row and even the preceding rows (in the descending ordering), so lag1_price and lag2_price are calculated in the inner sub-query along with count_row, the number of following rows. In the outer query, a case expression on count_row passes the appropriate combination of the current price, lag1_price, lag2_price and max_price1 to greatest() for the values 2, 1 and 0, and simply returns max_price1 when all 3 prior weeks are available.

Check this out:

val df = Seq((1, 5, 2019, 1, 20), (2, 4, 2019, 2, 18),
             (3, 3, 2019, 3, 21), (4, 2, 2019, 4, 20),
             (5, 1, 2019, 5, 1), (6, 52, 2018, 6, 2),
             (7, 51, 2018, 7, 3)).toDF("product_id", "week", "year", "rank", "price")

df.createOrReplaceTempView("sales")

val df2 = spark.sql("""
          select product_id, week, year, price,
          -- count_row: number of earlier weeks available in the 3-week look-back window
          count(*) over(order by year desc, week desc rows between 1 following and 3 following) as count_row,
          -- lag1_price / lag2_price: prices of the 1 and 2 more recent weeks, used as fallback for the oldest rows
          lag(price) over(order by year desc, week desc) as lag1_price,
          sum(price) over(order by year desc, week desc rows between 2 preceding and 2 preceding) as lag2_price,
          -- max_price1: max price of the 3 earlier weeks (null for the oldest row)
          max(price) over(order by year desc, week desc rows between 1 following and 3 following) as max_price1
          from sales
  """)
df2.show(false)
df2.createOrReplaceTempView("sales_inner")
spark.sql("""
          select product_id, week, year, price,
          case
             when count_row=2 then greatest(price,max_price1)             -- only 2 earlier weeks: include the current price
             when count_row=1 then greatest(price,lag1_price,max_price1)  -- only 1 earlier week: also include the next newer price
             when count_row=0 then greatest(price,lag1_price,lag2_price)  -- oldest week: use the current and the 2 newer prices
             else max_price1                                              -- 3 earlier weeks available: their max is the answer
          end as max_price
         from sales_inner
  """).show(false)

Results:

+----------+----+----+-----+---------+----------+----------+----------+
|product_id|week|year|price|count_row|lag1_price|lag2_price|max_price1|
+----------+----+----+-----+---------+----------+----------+----------+
|1         |5   |2019|20   |3        |null      |null      |21        |
|2         |4   |2019|18   |3        |20        |null      |21        |
|3         |3   |2019|21   |3        |18        |20        |20        |
|4         |2   |2019|20   |3        |21        |18        |3         |
|5         |1   |2019|1    |2        |20        |21        |3         |
|6         |52  |2018|2    |1        |1         |20        |3         |
|7         |51  |2018|3    |0        |2         |1         |null      |
+----------+----+----+-----+---------+----------+----------+----------+

+----------+----+----+-----+---------+
|product_id|week|year|price|max_price|
+----------+----+----+-----+---------+
|1         |5   |2019|20   |21       |
|2         |4   |2019|18   |21       |
|3         |3   |2019|21   |20       |
|4         |2   |2019|20   |3        |
|5         |1   |2019|1    |3        |
|6         |52  |2018|2    |3        |
|7         |51  |2018|3    |3        |
+----------+----+----+-----+---------+
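
For reference, the same inner/outer logic can also be sketched with the PySpark DataFrame API instead of SQL strings (a minimal sketch, assuming the data is in a PySpark DataFrame df as in the first answer, and using lag(price, 2) in place of the sum(...) trick for lag2_price):

from pyspark.sql.functions import col, count, lag, lit, max as max_, greatest, when
from pyspark.sql.window import Window

w = Window.orderBy(col("year").desc(), col("week").desc())

inner = (df
         .withColumn("count_row", count(lit(1)).over(w.rowsBetween(1, 3)))    # earlier weeks available
         .withColumn("lag1_price", lag("price", 1).over(w))                   # next newer week's price
         .withColumn("lag2_price", lag("price", 2).over(w))                   # price from 2 weeks newer
         .withColumn("max_price1", max_("price").over(w.rowsBetween(1, 3))))  # max of the 3 earlier weeks

result = inner.withColumn("max_price",
    when(col("count_row") == 2, greatest(col("price"), col("max_price1")))
    .when(col("count_row") == 1, greatest(col("price"), col("lag1_price"), col("max_price1")))
    .when(col("count_row") == 0, greatest(col("price"), col("lag1_price"), col("lag2_price")))
    .otherwise(col("max_price1")))

result.show(truncate=False)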

Upvotes: 0
