Rohini Mathur

Reputation: 441

Cannot resolve error in Spark when filtering records with two where conditions

SPARK 1.6, SCALA, MAVEN

I have created a DataFrame from an RDD and am trying to filter out all records where COLA is null or an empty string and COLB is 02 or 03.

I tried something like this:

df.filter(WHERE $"COLA isnull AND COLB =02 & 03")

But unfortunately I get the error "cannot resolve 'COLA isnull where COLB =02 & 03'".

Please help.

Upvotes: 0

Views: 695

Answers (2)

Andrew

Reputation: 8758

Messed up the syntax in my comment above.

$"cola".isNull && $"colb".isin("02","03")

That syntax works for me (2.1 and 2.4). If 1.6 doesn't like it, try it this way:

val foo = List("H","D")
df.filter($"COLA".isNull && $"colb".isin(foo: _*))
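
Note that the question also asks to treat empty strings like nulls, which `isNull` alone won't catch. A minimal sketch combining both conditions, assuming `df` has string columns named `COLA` and `COLB` (adjust names to your schema; `trim` guards against whitespace-only values):

```scala
import org.apache.spark.sql.functions.trim

// Keep rows where COLA is null or empty, AND COLB is "02" or "03".
val result = df.filter(
  ($"COLA".isNull || trim($"COLA") === "") &&
  $"COLB".isin("02", "03")
)
```

On Spark 1.6 you would need `import sqlContext.implicits._` in scope for the `$"..."` column syntax.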

Upvotes: 1

sangam.gavini

Reputation: 196

You can try as below:

df.filter($"COLA".isNull && $"COLB".isin("02","03"))

Refer for more details: Convert SQL Case Statement into Spark

Upvotes: 0
