Alex

Reputation: 47

Spark SQL: ALTER TABLE CHANGE

I want to import a few tables into a database which (at the moment) doesn't support decimal precision greater than 18. So I tried ALTER TABLE in Spark to lower the precision; how the values get rounded doesn't matter.

// Persist the DataFrame as a Hive table, then try to lower the column's precision
rt_rel_acc_acc.saveAsTable("rt_rel_acc_acc", SaveMode.Overwrite)
hiveContext.sql("ALTER TABLE rt_rel_acc_acc CHANGE REL001 REL001 decimal(18, 9)")
// Read the table back and export it to the target DB over JDBC
hiveContext.read.table("rt_rel_acc_acc").write.jdbc(url, "rt_rel_acc_acc", properties)

However, Spark doesn't seem to apply the change at all: the JDBC write on the third line still sees the original decimal(30, 15) schema and throws an exception:

PSQLException: ERROR: invalid numeric size '30,15'

ALTER TABLE x RENAME TO y, on the other hand, is applied immediately.

Any idea what's wrong with my approach?

Upvotes: 2

Views: 7251

Answers (1)

Srinivasarao Daruna

Reputation: 3374

You can cast any column in the DataFrame, like below.

// Add a new column holding the value cast down to decimal(18, 9)
val df2 = df.withColumn("New_Column_Name",
              df.col("Actual_Column_name").cast("decimal(18, 9)"))

In this case you will have both the old and the new column. You can drop the old one (or select only the columns you need) and then apply saveAsTable, or reuse the result for other tables; see the sketch below.
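For instance, here is a minimal sketch of the whole round trip, using the table and column names from the question (rt_rel_acc_acc, REL001) together with your hiveContext, url and properties; the intermediate name REL001_casted is purely illustrative:

// Read the table back, cast the over-precise column down, drop the
// original, and restore the original column name, so the schema handed
// to the JDBC writer is already decimal(18, 9)
val df = hiveContext.read.table("rt_rel_acc_acc")
val casted = df
  .withColumn("REL001_casted", df.col("REL001").cast("decimal(18, 9)"))
  .drop("REL001")
  .withColumnRenamed("REL001_casted", "REL001")

// Export over JDBC; the target DB never sees the decimal(30, 15) column
casted.write.jdbc(url, "rt_rel_acc_acc", properties)

Since the cast happens in the DataFrame itself rather than in the Hive metastore, it does not depend on ALTER TABLE CHANGE being picked up.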

Let me know if this works. If I have misunderstood the problem, I would be happy to try other solutions based on your response.

Thank You, Regards, Srini.

Upvotes: 1
