Edamame

Reputation: 25416

pyspark: merge (outer-join) two data frames

I have the following two data frames:

DF1:

    Id | field_A | field_B | field_C | field_D
     1 |   cat   |  12     |   black | 11
     2 |   dog   | 128     |   white | 19
     3 |   dog   |  35     |  yellow | 20
     4 |   dog   |  21     |   brown |  4
     5 |  bird   |  10     |    blue |  7
     6 |   cow   |  99     |   brown | 34

and

DF2:

    Id | field_B | field_C | field_D | field_E
     3 |  35     |  yellow | 20      |   123   
     5 |  10     |    blue |  7      |   454   
     6 |  99     |   brown | 34      |   398   

And I am hoping to get the new_DF as

    Id | field_A | field_B | field_C | field_D | field_E
     1 |   cat   |  12     |   black | 11      |
     2 |   dog   | 128     |   white | 19      |
     3 |   dog   |  35     |  yellow | 20      |  123
     4 |   dog   |  21     |   brown |  4      |  
     5 |  bird   |  10     |    blue |  7      |  454
     6 |   cow   |  99     |   brown | 34      |  398

Can this be achieved with data frame operations? Thanks!

Upvotes: 20

Views: 58671

Answers (1)

MaxU - stand with Ukraine

Reputation: 210982

try this:

new_df = df1.join(df2, on=['Id', 'field_B', 'field_C', 'field_D'], how='left_outer')

Including `Id` in the join keys matters: it appears in both frames, so joining only on `field_B`, `field_C`, `field_D` would leave two `Id` columns in the result. A left outer join keeps every row of `df1` and fills `field_E` with null where `df2` has no matching row, which is exactly the desired `new_DF`.
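If you just want to sanity-check the join logic locally without spinning up a SparkSession, the equivalent left outer join can be sketched with pandas (this is an illustration of the same merge semantics, not PySpark itself; data is taken from the question):

```python
import pandas as pd

df1 = pd.DataFrame({
    "Id": [1, 2, 3, 4, 5, 6],
    "field_A": ["cat", "dog", "dog", "dog", "bird", "cow"],
    "field_B": [12, 128, 35, 21, 10, 99],
    "field_C": ["black", "white", "yellow", "brown", "blue", "brown"],
    "field_D": [11, 19, 20, 4, 7, 34],
})

df2 = pd.DataFrame({
    "Id": [3, 5, 6],
    "field_B": [35, 10, 99],
    "field_C": ["yellow", "blue", "brown"],
    "field_D": [20, 7, 34],
    "field_E": [123, 454, 398],
})

# Left outer join on all shared columns: every row of df1 is kept,
# and rows with no match in df2 get NaN in field_E.
new_df = df1.merge(df2, on=["Id", "field_B", "field_C", "field_D"], how="left")
print(new_df)
```

The result has six rows and a single `Id` column, with `field_E` populated only for Ids 3, 5, and 6, matching the desired `new_DF` above.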

Upvotes: 36
