Reputation: 41
We are facing a peculiar problem. We have to store some data in Hive first, then copy the same data to Oracle using the Spark 3.x JDBC option. One column contains the value "aaaabbbbbb". With char/varchar handling enabled, Hive stores it in a varchar(4) column as "aaaa", but writing the same value to Oracle with the same column length fails with "Value too large expected 9 max 4". This worked perfectly in Spark 2.x.
I tried the following, but it did not work:

```
spark.conf.set("spark.sql.storeAssignmentPolicy", "LEGACY")
spark.conf.set("spark.sql.legacy.charVarcharAsString", value = true)
```
Any ideas would be appreciated.
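One workaround sketch, under the assumption that you truncate the column yourself before the JDBC write (e.g. with `substring` on the DataFrame) instead of relying on the engine's char/varchar enforcement. The helper name below is hypothetical; it only illustrates the truncation behavior Hive applies silently:

```python
# Hypothetical pre-write truncation: clamp a string to its declared
# VARCHAR length so the JDBC write to Oracle cannot exceed the column size.
# Mirrors what Hive did here: 'aaaabbbbbb' stored into varchar(4) as 'aaaa'.
def truncate_to_varchar(value, max_len):
    if value is None:
        return None
    return value[:max_len]

print(truncate_to_varchar("aaaabbbbbb", 4))  # -> aaaa
```

In Spark itself the equivalent would be selecting `substring(col, 1, 4)` (or the DataFrame `substr` function) on the offending column before calling the JDBC writer, which sidesteps the Spark 3.x length check entirely.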
Upvotes: 0
Views: 224