Jhon

Reputation: 49

How to convert a DataFrame string column to a Hive table BigInt

Spark: 1.6, Scala, Hive

I have a DataFrame; DF.printSchema shows:

root
 |-- rundatetime: string (nullable = true)
 |-- day_cunt: string (nullable = true)
 |-- my_key: integer (nullable = true)

DF.show()

rundatetime         | day_cunt | my_key
2017-04-21 11:00:06 | 5        | 10
2017-04-21 12:10:06 | 15       | 1000

My Hive table is

rundatetime String,
day_cunt    BigInt,
my_key      Int
Stored as Parquet;

How can I save the DataFrame to the Hive table? Please note that the DataFrame and Hive table data types are different.

Upvotes: 0

Views: 3346

Answers (1)

Leo C

Reputation: 22439

BigInt isn't a supported data type for Spark DataFrames.

You can create an intermediary DataFrame by casting your day_cunt column to Long:

val newDF = df.select($"rundatetime", $"day_cunt".cast("Long"), $"my_key")

Casting with cast("BigInt") won't throw an error, but in effect it just casts to the Long data type: Spark has no separate BigInt type, and a Long column maps to Hive's BIGINT.
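A minimal sketch of the full flow on Spark 1.6, assuming `sc` is an existing SparkContext and the Hive table is named `my_table` (a placeholder for your actual table name):

```scala
import org.apache.spark.sql.hive.HiveContext

// A HiveContext is needed to write into Hive-managed tables (Spark 1.6).
val hiveContext = new HiveContext(sc)
import hiveContext.implicits._

// Cast day_cunt from string to Long so it maps to Hive's BIGINT.
val newDF = df.select($"rundatetime", $"day_cunt".cast("Long"), $"my_key")

// Append into the existing Parquet-backed Hive table;
// column order must match the table definition.
newDF.write.mode("append").insertInto("my_table")
```

Since the target table already exists and is `Stored as Parquet`, `insertInto` writes into it by position, so keeping the select in the table's column order matters.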

Upvotes: 1
