Reputation: 1117
I have a DataFrame like below:
+---+---+---+
| t1| t2|t3 |
+---+---+---+
|0 |1 |0 |
+---+---+---+
I want to compare each column with every other column. For example, if the t1 column value is 0 and the t2 column value is 1, then the (t1, t2) combination column is 1. We have to apply a logical OR for all column pairs.
My expected output would be like below:
+----+---+---+---+
|t123| t1| t2| t3|
+----+---+---+---+
|  t1|  0|  1|  0|
|  t2|  1|  0|  1|
|  t3|  0|  1|  0|
+----+---+---+---+
Please help me with this.
Upvotes: 0
Views: 531
Reputation: 2200
For PySpark, you can create an empty DataFrame and then union one row per column in a loop. The loop marks each (x, i) pair with 1 when exactly one of the two columns is 1, which matches your expected output. The code below works not only for 3 columns but for any number of columns.
>>> import pyspark.sql.functions as F
>>>
>>> df.show()
+---+---+---+
| t1| t2| t3|
+---+---+---+
| 0| 1| 0|
+---+---+---+
>>> df1 = spark.createDataFrame(sc.emptyRDD(), df.schema)
>>> df1 = df1.select(F.lit('').alias('t123'), F.col('*'))
>>> df1.show()
+----+---+---+---+
|t123| t1| t2| t3|
+----+---+---+---+
+----+---+---+---+
>>> for x in df.columns:
... mydf = df.select([(F.when(df[i]+df[x]==1,1).otherwise(0)).alias(i) for i in df.columns])
... df1 = df1.union(mydf.select(F.lit(x).alias('t123'), F.col('*')))
...
>>> df1.show()
+----+---+---+---+
|t123| t1| t2| t3|
+----+---+---+---+
| t1| 0| 1| 0|
| t2| 1| 0| 1|
| t3| 0| 1| 0|
+----+---+---+---+
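For reference, here is a self-contained sketch of the same approach. The SparkSession setup and the sample data are assumptions (the snippet above relies on an existing df, spark and sc), and instead of starting from an empty DataFrame it unions the per-column rows directly:
>>> from pyspark.sql import SparkSession
>>> import pyspark.sql.functions as F
>>>
>>> spark = SparkSession.builder.getOrCreate()
>>> # sample data matching the question
>>> df = spark.createDataFrame([(0, 1, 0)], ['t1', 't2', 't3'])
>>>
>>> result = None
>>> for x in df.columns:
...     # for each column x, build one row: 1 where exactly one of (x, i) is 1
...     row = df.select(F.lit(x).alias('t123'),
...                     *[F.when(df[i] + df[x] == 1, 1).otherwise(0).alias(i) for i in df.columns])
...     result = row if result is None else result.union(row)
...
>>> result.show()
+----+---+---+---+
|t123| t1| t2| t3|
+----+---+---+---+
|  t1|  0|  1|  0|
|  t2|  1|  0|  1|
|  t3|  0|  1|  0|
+----+---+---+---+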
Upvotes: 0
Reputation: 11192
Try this,
import pandas as pd

cols = df.columns
n = len(cols)
# tile the single row n times: column j of every row holds t_j
df1 = pd.concat([df] * n, ignore_index=True).eq(1)
# tile the transposed row n times: every column of row i holds t_i
df2 = pd.concat([df.T] * n, axis=1, ignore_index=True).eq(1)
df2.columns = cols
df2 = df2.reset_index(drop=True)
print((df1 | df2).astype(int))
Explanation: df1 repeats the original row, so cell (i, j) holds the value of t_j; df2 repeats the transposed row, so cell (i, j) holds the value of t_i. The element-wise | then gives t_i OR t_j for every column pair.
Output:
   t1  t2  t3
0   0   1   0
1   1   1   1
2   0   1   0
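If you also want the t123 label column from the question's expected output, here is a minimal self-contained sketch of the same idea; the DataFrame construction and the extra insert of the label column are assumptions added for illustration:
import pandas as pd

# sample data from the question
df = pd.DataFrame({'t1': [0], 't2': [1], 't3': [0]})

cols = df.columns
n = len(cols)

# row i, column j of df1 holds t_j; row i, column j of df2 holds t_i
df1 = pd.concat([df] * n, ignore_index=True).eq(1)
df2 = pd.concat([df.T] * n, axis=1, ignore_index=True).eq(1)
df2.columns = cols
df2 = df2.reset_index(drop=True)

out = (df1 | df2).astype(int)
out.insert(0, 't123', list(cols))  # label each row with the column it was paired against
print(out)
#   t123  t1  t2  t3
# 0   t1   0   1   0
# 1   t2   1   1   1
# 2   t3   0   1   0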
Upvotes: 1