Pranav Waila

Reputation: 385

Filtering a pyspark dataframe

I am trying to select some values from a PySpark dataframe based on some rules, but I am getting an exception.

from pyspark.sql import functions as F

df.select(
    df.card_key,
    F.when((df.tran_sponsor = 'GAMES') & (df.location_code = '9145'), 'ENTERTAINMENT')
     .when((df.tran_sponsor = 'XYZ') & (df.location_code = '123'), 'eBOOKS')
     .when((df.tran_sponsor = 'XYZ') & (df.l_code.isin(['123', '234', '345', '456', '567', '678', '789', '7878', '67', '456'])), 'FINANCE')
     .otherwise(df.tran_sponsor)
).show()

I am encountering the following exception. Can you give some suggestions?

File "", line 1
    df.select(df.card_key,F.when((df.tran_sponsor = 'GAMES') & (df.location_code = '9145'),'ENTERTAINMENT').when((df.tran_sponsor = 'XYZ') & (df.location_code = '123'),'eBOOKS').when((df.tran_sponsor = 'XYZ') & (df.l_code.isin(['6001', '6002', '6003', '6004', '6005', '6006', '6007', '6008', '6009', '6010', '6011', '6012', '6013', '6014'])),'FINANCE').otherwise(df.tran_sponsor)).show()
                                                  ^
SyntaxError: invalid syntax

Upvotes: 1

Views: 2014

Answers (1)

Pranav Waila

Reputation: 385

Well, I just figured it out: there is no problem with the isin call; the problem was the assignment operator (`=` where `==` was intended) :(

df.select(
    df.card_key,
    F.when((df.tran_sponsor == 'GAMES') & (df.location_code == '9145'), 'ENTERTAINMENT')
     .when((df.tran_sponsor == 'XYZ') & (df.location_code == '123'), 'eBOOKS')
     .when((df.tran_sponsor == 'XYZ') & (df.l_code.isin(['123', '234', '345', '456', '567', '678', '789', '7878', '67', '456'])), 'FINANCE')
     .otherwise(df.tran_sponsor)
).show()

It works well now. Thanks for your efforts if anyone was looking into it.
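For anyone hitting the same error: the root cause is Python itself, not PySpark. A single `=` inside an expression is an assignment, which Python rejects at parse time, so the SyntaxError appears before Spark ever evaluates the condition. A minimal stdlib-only sketch (no Spark needed, using `compile` just to check parsing):

```python
# The bad version uses `=` (assignment) inside the condition,
# exactly as in the original question; the fixed version uses `==`.
expr_bad = "df.select(F.when((df.tran_sponsor = 'GAMES'), 'X'))"
expr_ok = "df.select(F.when((df.tran_sponsor == 'GAMES'), 'X'))"

def is_valid_syntax(src):
    """Return True if src parses as Python, False on SyntaxError."""
    try:
        compile(src, "<string>", "exec")
        return True
    except SyntaxError:
        return False

print(is_valid_syntax(expr_bad))  # False: `=` is not allowed in an expression
print(is_valid_syntax(expr_ok))   # True: `==` builds a Column comparison
```

Note that `compile` only parses the source, so the undefined names `df` and `F` do not matter here; the check isolates the syntax issue the traceback reported.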

Upvotes: 2
