Reputation: 879
The text "Welcome my application..❣️" behaves unexpectedly in my NSRange and Range tests. If ❣️ is included, Range(_:in:) returns nil, and I wonder why.
func testA() {
    let testStr = "Welcome my application..❣️"
    let range = NSRange(location: 0, length: testStr.count)
    let wrapRange = Range(range, in: testStr)

    let testStrB = "Welcome my application.."
    let rangeB = NSRange(location: 0, length: testStrB.count)
    let wrapRangeB = Range(rangeB, in: testStrB)

    print("wrapRange: \(wrapRange) wrapRangeB: \(wrapRangeB)")
}
RESULT:
wrapRange: nil wrapRangeB: Optional(Range(Swift.String.Index(_rawBits: 1)..<Swift.String.Index(_rawBits: 1572864)))
Upvotes: 4
Views: 777
Reputation: 540105
"❣️" is a single “extended grapheme cluster” (one Swift Character), but it consists of two UTF-16 code units:
print("❣️".count) // 1
print("❣️".utf16.count) // 2
NSRange counts UTF-16 code units (which are the “characters” in an NSString), therefore the correct way to create an NSRange comprising the complete range of a Swift string is
let range = NSRange(location: 0, length: testStr.utf16.count)
or better (since Swift 4):
let range = NSRange(testStr.startIndex..., in: testStr)
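Both approaches produce the same NSRange, and a range built this way converts back to a Range&lt;String.Index&gt; without returning nil. A quick check with the string from the question (the printed lengths assume "Welcome my application.." is 24 characters plus the two code units of ❣️):

    import Foundation

    let testStr = "Welcome my application..❣️"

    // Length measured in UTF-16 code units, as NSRange expects:
    let range = NSRange(location: 0, length: testStr.utf16.count)
    print(range.length) // 26, not 25 (testStr.count)

    // Since Swift 4: build the NSRange directly from a Swift range expression:
    let fullRange = NSRange(testStr.startIndex..., in: testStr)
    print(fullRange == range) // true

    // The round-trip conversion now succeeds:
    let wrapRange = Range(range, in: testStr)
    print(wrapRange != nil) // true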
Explanation: In your code (simplified here)
let testStr = "❣️"
let range = NSRange(location: 0, length: testStr.count)
print(range) // {0, 1}
creates an NSRange describing a single UTF-16 code unit. This cannot be converted to a Range&lt;String.Index&gt; in testStr, because its first Character consists of two UTF-16 code units:
let wrapRange = Range(range, in: testStr)
print(wrapRange) // nil
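For comparison, a sketch of the same simplified example using the UTF-16 length, where the conversion succeeds because the range covers the whole grapheme cluster:

    import Foundation

    let testStr = "❣️"

    // ❣️ is U+2763 followed by the variation selector U+FE0F: 2 UTF-16 code units.
    let range = NSRange(location: 0, length: testStr.utf16.count)
    print(range) // {0, 2}

    // The range now ends on a Character boundary, so the conversion works:
    let wrapRange = Range(range, in: testStr)
    print(wrapRange != nil) // true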
Upvotes: 8