Gadam

Reputation: 3024

Spark Delta merge: add Source column value to Target column value

I want the updated value in the target's column to be the sum of the source value and the target value.

example:

%scala 
import org.apache.spark.sql.functions._
import io.delta.tables._ 

// Create example delta table 
val dept = Seq(("Finance",10), ("Marketing",20),("Sales",30), ("IT",40) )
val deptColumns = Seq("dept_name","dept_emp_count")
val deptDF = dept.toDF(deptColumns:_*) 
deptDF.write.format("delta").mode("overwrite").saveAsTable("dept_table")

// Create example staged dataframe
val staged_df = spark.sql("select * from dept_table").withColumn("dept_emp_count", lit(1))

//How to do this merge?
// How to do this merge?
DeltaTable.forName(spark, "dept_table").as("events")
  .merge(staged_df.as("updates"), "events.dept_name = updates.dept_name")
  .whenMatched()
  .updateExpr(Map(
    "dept_emp_count" -> lit("events.dept_emp_count") + lit("updates.dept_emp_count"))) // How do I write this line?
  .execute()

Upvotes: 0

Views: 498

Answers (1)

Alex Ott

Reputation: 87144

The value in that update Map is a SQL expression string, so instead of lit("events.dept_emp_count") + lit("updates.dept_emp_count") you just need to write "events.dept_emp_count + updates.dept_emp_count".
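With that change, the merge from your example would look roughly like this (a minimal sketch, reusing the staged_df, table name, and aliases from the question):

%scala
import io.delta.tables._

DeltaTable.forName(spark, "dept_table").as("events")
  .merge(staged_df.as("updates"), "events.dept_name = updates.dept_name")
  .whenMatched()
  .updateExpr(Map(
    // both columns are referenced by alias inside a single SQL expression string
    "dept_emp_count" -> "events.dept_emp_count + updates.dept_emp_count"))
  .execute()

Since staged_df sets dept_emp_count to 1 for every department, after the merge Finance would go from 10 to 11, Marketing from 20 to 21, and so on.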

Upvotes: 1
