Reputation: 17054
I saw the "new type" BOOL
(YES
, NO
).
I read that this type is almost like a char.
For testing I did:
NSLog(@"Size of BOOL %zu", sizeof(BOOL));
NSLog(@"Size of bool %zu", sizeof(bool));
Good to see that both logs display "1" (in some C++ implementations bool is as big as an int, with a sizeof of 4).
So I was just wondering if there are any issues with the bool type, or anything else I should know.
Can I just use bool (which seems to work) without losing speed?
Upvotes: 201
Views: 206799
Reputation: 18909
The BOOL Objective-C type is in fact either a type alias to bool (from C's stdbool.h) or to signed char. In case you are curious, it turns out that which one is chosen where is rather complicated, and depends on the target platform and architecture.
While the Objective-C runtime is capable of making its own decisions about it, at least for Darwin, LLVM is always explicit on which one to choose:
- On 64-bit ARM (AArch64), BOOL is always a type alias to bool
- On 32-bit ARM, BOOL is a signed char for all platforms except for watchOS (where it is bool)
- On 32-bit Intel (x86), BOOL is a signed char for all platforms except for watchOS (where it is bool)
- On 64-bit Intel (x86-64), BOOL is a signed char for all platforms except for iOS (where it is bool)
You can read more about my investigation here: https://www.jviotti.com/2024/01/05/is-objective-c-bool-a-boolean-type-it-depends.html.
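If you need to branch on this at compile time with a recent SDK, the objc.h quoted in other answers exposes the choice as a macro; a minimal sketch:

#import <objc/objc.h>

#if OBJC_BOOL_IS_BOOL
// BOOL is a true bool on this target
#else
// BOOL is a signed char on this target
#endif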
Upvotes: 1
Reputation: 342
Also, be aware of differences in conversion behavior, especially when working with bitmasks, due to the cast to signed char:
bool a = 0x0100;
a == true; // expression true
BOOL b = 0x0100;
b == false; // expression true on !((TARGET_OS_IPHONE && __LP64__) || TARGET_OS_WATCH), e.g. macOS
b == true; // expression true on (TARGET_OS_IPHONE && __LP64__) || TARGET_OS_WATCH
If BOOL is a signed char instead of a bool, the cast of 0x0100 to BOOL simply drops the set bit, and the resulting value is 0.
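A quick sketch of the safe pattern (the mask and kFlag names here are hypothetical): normalize with ?: (or !!) instead of relying on the implicit narrowing cast:

#include <objc/objc.h>

unsigned int mask = 0x0100;              // hypothetical bitmask value
unsigned int kFlag = 0x0100;             // hypothetical flag to test
BOOL unsafe = (BOOL)(mask & kFlag);      // 0 where BOOL is signed char: bit 8 is dropped
BOOL safe = (mask & kFlag) ? YES : NO;   // YES on every platform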
Upvotes: 1
Reputation: 844
As mentioned above, BOOL can be a signed char type depending on your architecture, while bool is the C99 _Bool type, which only ever holds 0 or 1. A simple experiment will show why BOOL and bool can behave differently:
bool ansicBool = 64;
if(ansicBool != true) printf("This will not print\n");
printf("Any given vlaue other than 0 to ansicBool is evaluated to %i\n", ansicBool);
BOOL objcBOOL = 64;
if(objcBOOL != YES) printf("This might print depnding on your architecture\n");
printf("BOOL will keep whatever value you assign it: %i\n", objcBOOL);
if(!objcBOOL) printf("This will not print\n");
printf("! operator will zero objcBOOL %i\n", !objcBOOL);
if(!!objcBOOL) printf("!! will evaluate objcBOOL value to %i\n", !!objcBOOL);
To your surprise, if(objcBOOL != YES) may evaluate to true, since YES is actually just the character code 1, and in the eyes of the compiler character code 64 is of course not equal to character code 1; thus the condition evaluates to YES/true/1 and the following line will run.
However, since a nonzero bool type always evaluates to the integer value 1, the above issue will not affect your code. Below are some good tips if you want to use the Objective-C BOOL type vs the ANSI C bool type:
- Assign it only the YES or NO value and nothing else.
- Convert generic truthy values to BOOL types by using the double not !! operator, to avoid unexpected results (see the sketch after this list).
- When checking against YES, use if(!myBool) instead of if(myBool != YES); it is much cleaner to use the not ! operator, and it gives the expected result.
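For instance, a minimal, self-contained sketch of the double-not tip:

#include <stdio.h>
#include <objc/objc.h>

int main(void) {
    int flags = 64;                 // any nonzero value
    BOOL ok = !!flags;              // double negation collapses 64 down to 1 (YES)
    printf("%i\n", ok == YES);      // prints 1 on every platform
    return 0;
}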
Upvotes: 2
Reputation: 666
The accepted answer has been edited, and its explanation has become a bit incorrect: the code sample was refreshed, but the text below it stayed the same. You cannot assume that BOOL is always a char, since it depends on the architecture and platform. If you run your code on a 32-bit platform (for example an iPhone 5) and print @encode(BOOL), you will see "c", which corresponds to a char type. But if you run your code on an iPhone 5s (64-bit), you will see "B", which corresponds to a bool type.
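For instance, a one-line check along those lines (assuming Foundation is available for NSLog):

NSLog(@"%s", @encode(BOOL)); // "c" on signed char platforms, "B" where BOOL is a real bool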
Upvotes: 2
Reputation: 107754
From the definition in objc.h
:
#if (TARGET_OS_IPHONE && __LP64__) || TARGET_OS_WATCH
typedef bool BOOL;
#else
typedef signed char BOOL;
// BOOL is explicitly signed so @encode(BOOL) == "c" rather than "C"
// even if -funsigned-char is used.
#endif
#define YES ((BOOL)1)
#define NO ((BOOL)0)
So, on most platforms you can assume that BOOL is a signed char, though per the definition above it is a real bool on 64-bit iOS and on watchOS. You can use the (C99) bool type, but all of Apple's Objective-C frameworks and most Objective-C/Cocoa code use BOOL, so you'll save yourself a headache if the typedef ever changes by just using BOOL.
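For example, Foundation APIs traffic in BOOL rather than bool:

BOOL equal = [@"a" isEqualToString:@"a"]; // Foundation declares this method as returning BOOL
if (equal) NSLog(@"equal");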
Upvotes: 209
Reputation: 1406
At the time of writing this is the most recent version of objc.h:
/// Type to represent a boolean value.
#if (TARGET_OS_IPHONE && __LP64__) || TARGET_OS_WATCH
#define OBJC_BOOL_IS_BOOL 1
typedef bool BOOL;
#else
#define OBJC_BOOL_IS_CHAR 1
typedef signed char BOOL;
// BOOL is explicitly signed so @encode(BOOL) == "c" rather than "C"
// even if -funsigned-char is used.
#endif
It means that on 64-bit iOS devices and on watchOS, BOOL is exactly the same thing as bool, while on all other devices (OS X, 32-bit iOS) it is signed char and cannot even be overridden by the compiler flag -funsigned-char.
It also means that this example code will run differently on different platforms (I tested it myself):
int myValue = 256;
BOOL myBool = myValue;
if (myBool) {
printf("i'm 64-bit iOS");
} else {
printf("i'm 32-bit iOS");
}
BTW, never assign things like array.count to a BOOL variable: where BOOL is a signed char, the value is truncated to its low byte, so about 0.4% of possible count values (every multiple of 256) silently become 0, i.e. NO (see the sketch below).
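A minimal sketch of that pitfall:

NSUInteger count = 256;        // imagine array.count returned 256
BOOL hasItems = count;         // truncates to 0 where BOOL is signed char!
BOOL safe = (count > 0);       // an explicit comparison works on every platform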
Upvotes: 13
Reputation: 4044
Another difference between bool and BOOL is that they do not convert to exactly the same kind of object when you do key-value observing, or when you use methods like -[NSObject valueForKey:].
As everybody has said here, BOOL is a char (on most platforms). As such, it is converted to an NSNumber holding a char. This object is indistinguishable from an NSNumber created from a regular char like 'A' or '\0'. You have totally lost the information that you originally had a BOOL.
However, bool is converted to a CFBoolean, which behaves the same as NSNumber, but which retains the boolean origin of the object.
I do not think that this is an argument in a BOOL vs. bool debate, but this may bite you one day.
Generally speaking, you should go with BOOL, since this is the type used everywhere in the Cocoa/iOS APIs (designed before C99 and its native bool type).
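A small illustration, relying on the toll-free bridging between NSNumber and CFBoolean described above (the boxing behavior is an Apple implementation detail, shown here only to make the difference visible):

#import <Foundation/Foundation.h>

NSNumber *fromChar = @((signed char)1);  // boxes as a char NSNumber, like a signed char BOOL would
NSNumber *fromBool = @((bool)true);      // boxes as a CFBoolean
NSLog(@"%d", CFGetTypeID((__bridge CFTypeRef)fromChar) == CFBooleanGetTypeID()); // 0
NSLog(@"%d", CFGetTypeID((__bridge CFTypeRef)fromBool) == CFBooleanGetTypeID()); // 1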
Upvotes: 4
Reputation: 29767
As mentioned above, BOOL is a signed char, while bool is the boolean type from the C99 standard (_Bool).
BOOL takes YES/NO; bool takes true/false.
See examples:
bool b1 = 2;
if (b1) printf("REAL b1 \n");
if (b1 != true) printf("NOT REAL b1 \n");
BOOL b2 = 2;
if (b2) printf("REAL b2 \n");
if (b2 != YES) printf("NOT REAL b2 \n");
And result is
REAL b1
REAL b2
NOT REAL b2
Note that bool != BOOL. The result below is only "ONCE AGAIN - REAL b2":
b2 = b1;
if (b2) printf("ONCE AGAIN - REAL b2 \n");
if (b2 != true) printf("ONCE AGAIN - NOT REAL b2 \n");
If you want to convert bool to BOOL, you should use the following code:
BOOL b22 = b1 ? YES : NO; //and back - bool b11 = b2 ? true : false;
So, in our case:
BOOL b22 = b1 ? 2 : NO;
if (b22) printf("ONCE AGAIN MORE - REAL b22 \n");
if (b22 != YES) printf("ONCE AGAIN MORE - NOT REAL b22 \n");
And so... what do we get now? :-)
Upvotes: 36
Reputation: 14441
I go against convention here. I don't like typedefs to base types. I think it's a useless indirection that removes value.
Upvotes: 1
Reputation: 126085
The Objective-C type you should use is BOOL. Objective-C itself has no native boolean datatype, so to be sure that the code compiles on all compilers, use BOOL. (It's defined in the Apple frameworks.)
Upvotes: 8
Reputation: 21892
Yup, BOOL is a typedef for a signed char according to objc.h.
I don't know about bool, though. That's a C++ thing, right? If it's defined as a signed char where 1 is YES/true and 0 is NO/false, then I imagine it doesn't matter which one you use.
Since BOOL is part of Objective-C, though, it probably makes more sense to use a BOOL for clarity (other Objective-C developers might be puzzled if they see a bool in use).
Upvotes: 5