yyuankm

Reputation: 365

Spark Scala: how to convert an Integer column in a DataFrame to an uppercase hex string?

We can use the following to convert a single integer value:

val x = 100
Integer.toString(x, 16).toUpperCase

But how do I apply it to an integer column to generate a new column with the hex string? Thanks!

The method below does not work:

testDF = testDF.withColumn("data_hex_string", Integer.toString(testDF("data"), 16).toUpperCase)

Upvotes: 0

Views: 1255

Answers (2)

Mark Rajcok

Reputation: 364697

As @jxc already mentioned in a comment, use the conv function:

import org.apache.spark.sql.functions.{conv, lower}
df.withColumn("hex", conv($"int_col",10,16)).show

For those who want lowercase, wrap it with lower:

df.withColumn("hex", lower(conv($"int_col",10,16))).show

Upvotes: 2

Lamanus

Reputation: 13551

AFAIK, there isn't a native Spark function for this, so create a UDF to do it.

import org.apache.spark.sql.functions.udf
def toHex = udf((int: Int) => java.lang.Integer.toString(int, 16).toUpperCase)

df.withColumn("hex", toHex($"int")).show()

+---+---+---+
| id|int|hex|
+---+---+---+
|  1|  1|  1|
|  2| 11|  B|
|  3| 23| 17|
+---+---+---+
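
For reference, a small DataFrame like the one below reproduces the output above (the sample id/int values are inferred from that table):

import spark.implicits._

// Assumed sample data matching the id and int columns shown above.
val df = Seq((1, 1), (2, 11), (3, 23)).toDF("id", "int")
df.withColumn("hex", toHex($"int")).show()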

Upvotes: 2
