Reputation: 551
I have the following pyspark.DataFrame
+---+--------+--------+--------------+
|SEX|_AGEG5YR|_IMPRACE| _LLCPWT|
+---+--------+--------+--------------+
| 2| 11.0| 1.0| 79.4259469451|
| 2| 10.0| 1.0| 82.1648291655|
| 2| 11.0| 2.0| 55.7851100058|
| 2| 13.0| 1.0|115.9818718258|
| 2| 12.0| 1.0|194.7566575195|
+---+--------+--------+--------------+
I want to create a new column based on the SEX column.
As suggested by this previous answer, I have defined a MapType
literal as follows:
from itertools import chain
from pyspark.sql.functions import create_map, lit

brfss_mapping = {
    "SEX": {
        1: "Male",
        2: "Female",
        9: "Refused"
    }
}

brfss_sex_mapping = create_map(
    [lit(x) for x in chain(*brfss_mapping["SEX"].items())]
)
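The flattening step that feeds create_map can be checked in plain Python: chain(*mapping.items()) turns the dict into the alternating key, value sequence that create_map expects. A minimal sketch (the names sex_mapping and flattened are illustrative, not from the original code):

```python
from itertools import chain

# Same mapping as above; chain(*dict.items()) flattens the
# (key, value) pairs into an alternating key, value sequence,
# which is the argument layout create_map expects.
sex_mapping = {1: "Male", 2: "Female", 9: "Refused"}
flattened = list(chain(*sex_mapping.items()))
print(flattened)  # [1, 'Male', 2, 'Female', 9, 'Refused']
```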
Now, when I use withColumn
and brfss_sex_mapping.getItem(...)
with a constant key, as below,
brfss_dmy = brfss_dmy.withColumn(
"SEX_2",
brfss_sex_mapping.getItem(1)
)
I get the expected result
+---+--------+--------+--------------+-----+
|SEX|_AGEG5YR|_IMPRACE| _LLCPWT|SEX_2|
+---+--------+--------+--------------+-----+
| 1| 13.0| 1.0|381.8001043164| Male|
| 2| 10.0| 1.0| 82.1648291655| Male|
| 1| 11.0| 1.0|279.1864457296| Male|
| 1| 10.0| 1.0| 439.024136158| Male|
| 2| 8.0| 1.0| 372.921644978| Male|
+---+--------+--------+--------------+-----+
However, when I try to pass the appropriate column as follows (again, as suggested in the previous answer),
brfss_dmy = brfss_dmy.withColumn(
"SEX_2",
brfss_sex_mapping.getItem(col("SEX"))
)
I get the following error:
java.lang.RuntimeException: Unsupported literal type class org.apache.spark.sql.Column SEX
Upvotes: 0
Views: 548
Reputation: 32700
It appears that in Spark 3.0 we can no longer pass a Column to the getItem
function, although I couldn't find any reference to this change in the code or documentation.
You can use element_at
instead:
from pyspark.sql.functions import col, element_at

df.withColumn("SEX_2", element_at(brfss_sex_mapping, col("SEX"))).show()
Or access the value with bracket notation, as you would an array:
df.withColumn("SEX_2", brfss_sex_mapping[col("SEX")]).show()
In Scala:
df.withColumn("SEX_2", element_at(brfss_sex_mapping, $"SEX")).show()
df.withColumn("SEX_2", brfss_sex_mapping($"SEX")).show()
Upvotes: 1