RamG213

Reputation: 25

How to convert to BIGINT type in Spark Scala

I applied SHA1 to some data, producing HASHBYTE values, and I want to convert those HASHBYTE values to BIGINT.
So I tried the following:

val testDF = Seq("Ox532831f5e2effdcb4cf51e42f05e83f4b45679f3").toDF
testDF.withColumn("big_Int", col("value").cast("bigint")).show(false)
+------------------------------------------+-------+
|value                                     |big_Int|
+------------------------------------------+-------+
|0x532831f5e2effdcb4cf51e42f05e83f4b45679f3|null   |
+------------------------------------------+-------+

However, the result was null.
I then checked the BIGINT conversion using CAST in T-SQL:

select cast(0x532831F5E2EFFDCB4CF51E42F05E83F4B45679F3 as BIGINT)

Returns: -1126317769775220237

What I ultimately want is to reproduce this T-SQL CAST-to-BIGINT behavior in Spark Scala.
Thank you very much for your help.

Upvotes: 0

Views: 4450

Answers (1)

Areg Nikoghosyan

Reputation: 521

For large integers you should use LongType:

cabArticleGold.withColumn("CAB", 'CAB.cast(LongType))

or

cabArticleGold.withColumn("CAB", 'CAB.cast("long"))

You can also use DecimalType:

cabArticleGold.withColumn("CAB", 'CAB.cast(DecimalType(38, 0)))

or

cabArticleGold.withColumn("CAB", 'CAB.cast("decimal(38, 0)"))

Upvotes: 1
