Reputation: 148524
I read about Swift's floating-point types, which are:
> The Swift standard library provides three signed floating-point number types: Float for 32-bit floating-point numbers, Double for 64-bit floating-point numbers, and Float80 for extended-precision 80-bit floating-point numbers.
But I didn't see a "base 10" decimal type, and I think the values are represented in base 2, because:
let g1 = Double(0.1) + Double(0.2)
println(g1 == Double(0.3)) // false
Question:
How can I make this comparison evaluate to true?
Upvotes: 1
Views: 583
Reputation: 299345
Swift has the same base-10 numerical type as ObjC has had for years: NSDecimalNumber.
One nice change in Swift is that you can, if you choose, add operator overloads to simplify its usage.
import Foundation
public func +(lhs: NSDecimalNumber, rhs: NSDecimalNumber) -> NSDecimalNumber {
return lhs.decimalNumberByAdding(rhs)
}
let g1 = NSDecimalNumber(double: 0.1) + NSDecimalNumber(double: 0.2)
println(g1 == NSDecimalNumber(double: 0.3)) // true
If you want even more certainty of avoiding rounding errors (there's a possible rounding error when you take a value through the implicit double conversion to get it into NSDecimalNumber(double:)), you can use the more explicit, integer-based initializer, which cannot suffer rounding:
let point1 = NSDecimalNumber(mantissa: 1, exponent: -1, isNegative: false)
let point2 = NSDecimalNumber(mantissa: 2, exponent: -1, isNegative: false)
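Putting the pieces together, here's a sketch of the full round trip: building 0.1, 0.2, and 0.3 from integer mantissas (so no binary floating-point value is ever involved) and comparing with the + overload defined above:

```swift
import Foundation

// Overload + so NSDecimalNumber values can be added directly (as above).
public func +(lhs: NSDecimalNumber, rhs: NSDecimalNumber) -> NSDecimalNumber {
    return lhs.decimalNumberByAdding(rhs)
}

// mantissa * 10^exponent: 1 * 10^-1 == 0.1, exactly, in base 10.
let point1 = NSDecimalNumber(mantissa: 1, exponent: -1, isNegative: false)
let point2 = NSDecimalNumber(mantissa: 2, exponent: -1, isNegative: false)
let point3 = NSDecimalNumber(mantissa: 3, exponent: -1, isNegative: false)

println(point1 + point2 == point3) // true
```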
You may be tempted to use NSDecimalNumber(string:), and it's useful, but you have to be very careful about locales. You probably want to use systemLocale() (which is really a "fallback" locale with fixed settings) to avoid comma-vs-dot confusion:
let g2 = NSDecimalNumber(string: "0.1", locale:NSLocale.systemLocale())
But I'd avoid strings here and use the other techniques that are safer.
Upvotes: 4
Reputation: 6862
Honestly? There's just no clean solution.
Others have written a fuzzyEquals method that compares only to the nth decimal place.
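A minimal sketch of what such a helper might look like; the name fuzzyEquals and the places parameter are illustrative, not from any standard API:

```swift
import Foundation

// Treat two Doubles as equal if they differ by less than 10^-places,
// i.e. they agree to the given number of decimal places.
func fuzzyEquals(a: Double, b: Double, places: Int) -> Bool {
    let tolerance = pow(10.0, Double(-places))
    return abs(a - b) < tolerance
}

println(fuzzyEquals(0.1 + 0.2, 0.3, 10)) // true
```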
Here's a similar question.
Upvotes: 0