Reputation: 1627
I'm working on a mixed Objective-C and Swift project, and I've noticed that when a Swift class has an Int property whose value is 0, reading that property from Objective-C code appears to return nil.
A Swift class with an integer property that is visible from ObjC:
@objc
class SwiftInt: NSObject {
    @objc let testInt: Int = 0
}
Now when I read this property from Objective-C, it says that testInt is nil:
- (void)viewDidLoad {
    [super viewDidLoad];
    SwiftInt *swiftInt = [SwiftInt new];
    if (swiftInt.testInt == nil) {
        NSLog(@"This shouldn't be nil");
    }
}
If I set the property to any number other than 0, that value is returned correctly, but for 0 it compares equal to nil. The question is: why is this nil in Objective-C when Int is a primitive, non-optional type? I'm using Swift 4.2.
Upvotes: 3
Views: 816
Reputation: 2875
Because in Objective-C, nil is defined as __DARWIN_NULL:
#ifndef nil
# if __has_feature(cxx_nullptr)
# define nil nullptr
# else
# define nil __DARWIN_NULL
# endif
#endif
which in turn is defined as (void *)0 in Objective-C.
So your code:
if (swiftInt.testInt == nil) { ... }
is effectively:
if (0 == nil) { ... }
if (0 == 0) { ... }
which is always true when testInt is 0.
A good article about nil in Objective-C: http://benford.me/blog/the-macro-behind-nil/
Upvotes: 7