
Reputation: 107

Using Filter Condition While Joining Spark Dataframes: Spark/Scala

Can someone please suggest how to use a filter while joining two DataFrames in Spark/Scala? I am trying the code below.

    var name="abcd"
    var last_name="xyz"

    val df3 = df1.join(df2, df1("id") === df2("id"))
    .filter(df1("name")==='${name}').
    filter(df1("last_name")==='${last_name}')
    .drop(df1("name"))
    .drop(df2("name"))

But I am getting multiple errors.


Upvotes: 0

Views: 573

Answers (1)

QuickSilver

Reputation: 4045

Spark is not like Java's JDBC APIs, where a string value in a WHERE condition needs to be wrapped in single quotes. Simply try using the `name` variable without any quotes or `$` sign:

    var name = "abcd"
    var last_name = "xyz"
    val df3 = df1.join(df2, df1("id") === df2("id"))
      // compare columns to plain Scala values; Spark lifts them to literals
      .filter(df1("name") === name && df1("last_name") === last_name)
      .drop(df1("name"))
      .drop(df2("name"))
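For reference, here is a minimal self-contained sketch of the corrected join. The sample rows, column values, and the local `SparkSession` are made up for illustration; only the join/filter/drop pattern mirrors the answer above.

```scala
import org.apache.spark.sql.SparkSession

// Local session purely for this sketch
val spark = SparkSession.builder()
  .master("local[*]")
  .appName("join-filter-sketch")
  .getOrCreate()
import spark.implicits._

// Hypothetical stand-ins for the asker's df1 and df2
val df1 = Seq((1, "abcd", "xyz"), (2, "efgh", "pqrs"))
  .toDF("id", "name", "last_name")
val df2 = Seq((1, "abcd"), (2, "efgh")).toDF("id", "name")

val name = "abcd"
val last_name = "xyz"

// === against a plain Scala value lifts it to a literal column,
// so no quoting or string interpolation is needed
val df3 = df1.join(df2, df1("id") === df2("id"))
  .filter(df1("name") === name && df1("last_name") === last_name)
  .drop(df1("name"))
  .drop(df2("name"))

val rows = df3.collect() // remaining columns: df1.id, last_name, df2.id
spark.stop()
```

If you do want a SQL-style condition string, `filter` also accepts a condition expression, e.g. `df1.filter(s"name = '$name'")`, where Scala's `s`-interpolator substitutes the variable before Spark parses the string.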

Upvotes: 2
