Pat64

Reputation: 153

Transform sequence of strings to join columns

I have the following Sequence and DataFrames:

df1.select("link1", "link2").show
+-----+-----+
|link1|link2|
+-----+-----+
|    1|    1|
|    2|    1|
|    2|    1|
|    3|    1|
|    5|    2|
+-----+-----+

df2.select("link1_2", "link2_2").show
+-------+-------+
|link1_2|link2_2|
+-------+-------+
|      2|      1|
|      2|      4|
|      4|      1|
|      5|      2|
|      3|      4|
+-------+-------+

val col_names = Seq("link1", "link2")

I want to create the following join:

df1.join(df2, 'link1 === 'link1_2 && 'link2 === 'link2_2)

without hard-coding the linking columns. I basically need a way to do the following transformation:

Seq("str1", "str2", ...) -> 'str1 === 'str1_2 && 'str2 === 'str1_2 && ...

I have tried the following approach, which doesn't seem to work:

df1.join(df2, col_names map (str: String => col(str) === col(str + "_2")).foldLeft(true)(_ && _))

Does anybody know how to write the above transformation?

Upvotes: 0

Views: 235

Answers (1)

Leo C

Reputation: 22449

There is no need to traverse the column list twice. Just use foldLeft as shown below:

import org.apache.spark.sql.functions._
import spark.implicits._

val df1 = Seq(
  (1, 1), (2, 1), (2, 1), (3, 1), (5, 2)
).toDF("c1", "c2")

val df2 = Seq(
  (2, 1), (2, 4), (4, 1), (5, 2), (3, 4)
).toDF("c1_2", "c2_2")

val cols = Seq("c1", "c2")

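// Fold over the column names, starting from a literal `true` Column
// and AND-ing in one equality condition per name.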
df1.
  join(df2, cols.foldLeft(lit(true))((cond, c) => cond && col(c) === col(c + "_2"))).
  show
//+---+---+----+----+                                                             
//| c1| c2|c1_2|c2_2|
//+---+---+----+----+
//|  2|  1|   2|   1|
//|  2|  1|   2|   1|
//|  5|  2|   5|   2|
//+---+---+----+----+
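
As a side note, the map-based attempt from the question can also be made to work: wrap the typed lambda parameter in parentheses and combine the resulting Column expressions with reduce, rather than folding with a plain Boolean seed. A minimal sketch reusing the same df1, df2 and cols as above (it assumes cols is non-empty, since reduce needs at least one element):

val joinCond = cols.
  map(c => col(c) === col(c + "_2")).  // one equality Column per name
  reduce(_ && _)                       // AND them all together

df1.join(df2, joinCond).show
// same three matching rows as the foldLeft version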

Upvotes: 1
