Shankar

Reputation: 8957

Spark SQL sum function issues on double value

We are trying to sum double values using the Spark SQL sum function.

Sample Data:

+------+
|amount|
+------+
|  1000|
|  1050|
|  2049|
+------+

Sample Code:

df.select("amount").show();
df.registerTempTable("table");
sqlContext.sql("select amount/pow(10,2) from table").show();
sqlContext.sql("select sum((amount/pow(10,2))) from table").show();

Output after dividing by 100:

+-----+
|  _c0|
+-----+
| 10.0|
| 10.5|
|20.49|
+-----+

Output After Sum:

+------------------+
|               _c0|
+------------------+
|40.989999999999995|
+------------------+

The expected output is 40.99, so why is it giving 40.989999999999995 instead?

Appreciate any help on this.
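(For reference, the same drift is reproducible outside Spark with plain Java doubles: 2049/100 has no exact binary64 representation, so the sum lands a few ulps below 40.99. A minimal sketch, not Spark code:)

```java
public class DoubleDrift {
    public static void main(String[] args) {
        // 10.0 and 10.5 are exactly representable in binary64,
        // but 20.49 is not -- it is stored as roughly 20.489999999999998.
        double a = 1000 / Math.pow(10, 2); // 10.0  (exact)
        double b = 1050 / Math.pow(10, 2); // 10.5  (exact)
        double c = 2049 / Math.pow(10, 2); // 20.49 (inexact)

        // The accumulated error surfaces in the printed sum.
        System.out.println(a + b + c); // 40.989999999999995
    }
}
```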

Upvotes: 4

Views: 7105

Answers (1)

Boggio

Reputation: 1148

Make sure the sum is evaluated on DECIMAL values rather than DOUBLE (per Spark SQL's type mapping).

eg: select sum( cast(amount as decimal) / cast(pow(10,2) as decimal) ) from table

I would recommend converting the amount column to be of type decimal.
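(The plain-Java analogue of that cast is `BigDecimal`, the type Spark SQL's DECIMAL maps to; decimal arithmetic keeps the division and the sum exact. A sketch for illustration, not Spark code:)

```java
import java.math.BigDecimal;

public class DecimalSum {
    public static void main(String[] args) {
        // Each division by 100 is exact in decimal arithmetic,
        // so the sum carries no binary rounding error.
        BigDecimal hundred = BigDecimal.valueOf(100);
        BigDecimal sum = BigDecimal.valueOf(1000).divide(hundred)
                .add(BigDecimal.valueOf(1050).divide(hundred))
                .add(BigDecimal.valueOf(2049).divide(hundred));

        System.out.println(sum); // 40.99
    }
}
```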

Upvotes: 4
