Reputation: 6952
Please look at the code below and the result in console.
NSString *strRatio = @"0.03";
float f = [strRatio floatValue];
NSLog(@"%f, %@", f, f == 0.03 ? @"equal" : @"not equal");
result:
0.030000, not equal
Also, when I add a breakpoint at the NSLog line, the debugger shows a different value of f: 0.0299999993...
Can anyone explain why f == 0.03 is false?
Edit:
I expect f to be exactly 0.03 after converting from @"0.03"; how can I achieve that?
It seems that float can't represent 0.03 exactly. Even if I assign the literal 0.03 to a float directly, I still get 0.0299999993 as the result.
Upvotes: 2
Views: 648
Reputation: 2435
Try NSDecimalNumber instead of [string floatValue]:
NSDecimalNumber *number1 = [NSDecimalNumber decimalNumberWithString:@"0.03"];
NSLog(@"number1: %@", number1); //0.03
Upvotes: 1
Reputation: 318914
The value is not 0.03; it is 0.0299999993, as shown in the debugger.
It shows as 0.030000 in the log because by default, %f prints 6 decimal places, so the value 0.0299999993 is rounded to 0.030000.
Change the log to use %.10f
and you will see the real value.
Upvotes: 1