ashK

Reputation: 733

Date add/subtract in cassandra/spark query

I have a scenario where I need to join multiple tables and check whether one date column plus another integer column is greater than a second date column.

SELECT CASE WHEN (manufacturedate + LeadTime < DueDate)
            THEN numericvalue((DueDate - manufacturedate) + 1)
            ELSE PartSource.EffLeadTime
       END

Is there a way to handle it in spark sql?

Thanks, Ash

Upvotes: 1

Views: 2185

Answers (2)

aravinth

Reputation: 155

I tried with SQLContext; there is a date_add(date, integer) function. date_add() is Hive functionality, and it works with the Cassandra context too.

cc.sql("select date_add(current_date(),1) from table").show
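Applied to the columns from the question, the same idea could look like the sketch below. The table name parts is a hypothetical stand-in, and this assumes manufacturedate/DueDate are date columns and LeadTime is an integer number of days; whether date_add accepts a column (rather than a literal) for the day count may depend on the Hive/Spark version, so treat this as untested:

```scala
// Hypothetical sketch: shift manufacturedate forward by LeadTime days
// and compare against DueDate, mirroring the CASE expression in the question.
cc.sql("""
  SELECT CASE
           WHEN date_add(manufacturedate, LeadTime) < DueDate
           THEN datediff(DueDate, manufacturedate) + 1
           ELSE EffLeadTime
         END AS newVal
  FROM parts
""").show()
```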

Thanks Aravinth

Upvotes: 2

Daniel de Paula

Reputation: 17872

Assuming you have a DataFrame with your data, you are using Scala, and the "another integer" column represents a number of days, one way to do it is the following:

import org.apache.spark.sql.functions._

val numericvalue = 1
// If the gap in days between DueDate and manufacturedate exceeds LeadTime,
// use numericvalue; otherwise fall back to PartSource.EffLeadTime.
val column = when(
  datediff(col("DueDate"), col("manufacturedate")) > col("LeadTime"), lit(numericvalue)
).otherwise(col("PartSource.EffLeadTime"))
val result = df.withColumn("newVal", column)

The desired value will be in a new column called "newVal".
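For illustration, a minimal end-to-end sketch of the approach above. The sample rows, the sqlContext setup, and the flattened EffLeadTime column (instead of the nested PartSource.EffLeadTime) are all assumptions made here for a self-contained example:

```scala
import org.apache.spark.sql.functions._

// Hypothetical sample rows: (manufacturedate, DueDate, LeadTime, EffLeadTime)
val df = sqlContext.createDataFrame(Seq(
  ("2016-01-01", "2016-01-10", 5, 7),
  ("2016-01-01", "2016-01-03", 5, 7)
)).toDF("manufacturedate", "DueDate", "LeadTime", "EffLeadTime")

val numericvalue = 1
// datediff compares the two date columns in days; when the gap exceeds
// LeadTime, take numericvalue, otherwise fall back to EffLeadTime.
val column = when(
  datediff(col("DueDate"), col("manufacturedate")) > col("LeadTime"), lit(numericvalue)
).otherwise(col("EffLeadTime"))

df.withColumn("newVal", column).show()
```

In the first sample row the date gap (9 days) exceeds LeadTime, so newVal takes numericvalue; in the second row it does not, so newVal falls back to EffLeadTime.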

Upvotes: 1
