ibaralf

Reputation: 12528

Spark Decimal Precision and Scale seems wrong when Casting

Reading the documentation, a Spark DataType BigDecimal(precision, scale) means that precision is the total number of digits and scale is the number of digits after the decimal point.

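To make sure I'm reading that right, here is a quick sketch of my understanding using org.apache.spark.sql.types.DecimalType (my own example, not taken from the docs):

import org.apache.spark.sql.types.DecimalType

// decimal(9,8): at most 9 significant digits in total,
// 8 of them after the decimal point
val dt = DecimalType(9, 8)
dt.precision  // 9
dt.scale      // 8
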
So when I cast a value to decimal

scala> val sss = """select cast(1.7142857343 as decimal(9,8))"""
scala> spark.sql(sss).show
+----------------------------------+
|CAST(1.7142857343 AS DECIMAL(9,8))|
+----------------------------------+
|                        1.71428573|  // It has 8 decimal digits
+----------------------------------+

But when I cast values above 10.0, I get NULL

scala> val sss = """select cast(12.345678901 as decimal(9,8))"""
scala> spark.sql(sss).show
+----------------------------------+
|CAST(12.345678901 AS DECIMAL(9,8))|
+----------------------------------+
|                              null|
+----------------------------------+

I would expect the result to be 12.3456789, not null.

Upvotes: 5

Views: 14478

Answers (1)

hagarwal

Reputation: 1163

To cast a decimal, Spark internally validates that the provided schema decimal(9,8) is wider than the actual schema of 12.345678901, which is decimal(11,9) (11 total digits, 9 of them after the decimal point). If it is wider, the value can be cast into the provided schema safely without losing any precision or range. Have a look at org.apache.spark.sql.types.DecimalType.isWiderThan().

However, in the above case decimal(9,8) is not wider than decimal(11,9): it leaves only 9 - 8 = 1 digit for the integer part, and 12 needs two, therefore the cast returns null.
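
As a rough sketch of that check (a standalone re-implementation of the decimal-vs-decimal case as I read it in DecimalType.isWiderThan(), which is not public API, so take this as an illustration rather than a call into Spark):

// decimal(p1,s1) is "wider than" decimal(p2,s2) when it has at least as many
// integer digits (p - s) and at least as many fractional digits (s)
def isWiderThan(p1: Int, s1: Int, p2: Int, s2: Int): Boolean =
  (p1 - s1) >= (p2 - s2) && s1 >= s2

isWiderThan(9, 8, 11, 9)   // false: only 9 - 8 = 1 integer digit, but 12.345678901 needs 2
isWiderThan(12, 9, 11, 9)  // true:  3 integer digits and 9 fractional digits are enough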

//MAX_PRECISION = 38
val sss = """select cast(12.345678901 as decimal(38,7))"""
spark.sql(sss).show(10)
+-----------------------------------+
|CAST(12.345678901 AS DECIMAL(38,7))|
+-----------------------------------+
|                         12.3456789|
+-----------------------------------+
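
You don't have to go all the way to precision 38, either; any target type that leaves enough integer digits should work. For example (a sketch following the same rule, not output pasted from a session):

// decimal(10,8) leaves 10 - 8 = 2 integer digits, enough to hold 12,
// and the extra fractional digit is rounded away
val sss2 = """select cast(12.345678901 as decimal(10,8))"""
spark.sql(sss2).show()
// expected result: 12.34567890 (not null)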

Upvotes: 6
