Reputation: 3192
This one really has me mystified.
I have a dictionary where I track counts of errors and warnings.
In an if (...) block, I'm testing for 0 errors, and for some reason every combination of code I've tried says that the errors count > 0 is true.
Here are some results from the LLDB console...
(lldb) po violationCounts[@"errors"]
0
(lldb) po [violationCounts[@"errors"] class]
__NSCFNumber
(lldb) p violationCounts[@"errors"] > 0
(bool) $2 = true
(lldb) p ((int)violationCounts[@"errors"]) > 0
(bool) $3 = true
Why is this evaluating as if violationCounts[@"errors"] were > 0?
Upvotes: 1
Views: 45
Reputation: 112857
The value of violationCounts[@"errors"] is an NSNumber, so it must be converted to an integer, not cast:
if ([violationCounts[@"errors"] integerValue] > 0)
All values in an NSDictionary are objects. Comparing the object pointer itself with 0 only asks whether the pointer is non-nil, which is always true here, and casting the pointer to int just reinterprets its address rather than unboxing the value.
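For illustration, here is a minimal, self-contained sketch (the dictionary contents are assumed, based on the question) contrasting the broken pointer comparison with the correct value comparison; clang typically accepts the broken form with only a warning about an ordered comparison between a pointer and zero, which is why it compiles and runs:

#import <Foundation/Foundation.h>

int main(void) {
    @autoreleasepool {
        // Assumed setup: counts stored as NSNumber values, as in the question.
        NSDictionary *violationCounts = @{ @"errors": @0, @"warnings": @2 };

        // Wrong: compares the NSNumber *pointer* with 0. The object is
        // non-nil, so this branch is taken even though the value is 0.
        if (violationCounts[@"errors"] > 0) {
            NSLog(@"pointer comparison claims errors > 0");
        }

        // Right: unbox the number first, then compare the value.
        if ([violationCounts[@"errors"] integerValue] > 0) {
            NSLog(@"there are errors");
        } else {
            NSLog(@"no errors");
        }
    }
    return 0;
}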
To print it in the debugger it is not necessary to convert it, because the description method will be called and that converts it for display. But it is necessary to use "po" (print object) instead of "p" (print):
po violationCounts[@"errors"]
Upvotes: 1
Reputation: 21137
Since it is saved as an NSNumber, it should be:
po [violationCounts[@"errors"] integerValue] > 0
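Applied to the session in the question (where errors is 0), the fixed expression should evaluate to false, along these lines (the register number $4 is illustrative):

(lldb) p [violationCounts[@"errors"] integerValue] > 0
(bool) $4 = false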
Upvotes: 2