Reputation: 13
In iOS 13 and earlier I got the Font.Weight of a UIFont using this extension:
extension UIFont {
    var weight: UIFont.Weight {
        // Read the weight trait from the font descriptor; fall back to .regular.
        guard let traits = fontDescriptor.object(forKey: .traits) as? [UIFontDescriptor.TraitKey: Any],
              let weightValue = traits[.weight] as? CGFloat else {
            return .regular
        }
        return UIFont.Weight(rawValue: weightValue)
    }
}
But since iOS 14, weightValue is wrong. For example:
let font = UIFont(name: "AvenirNext-Bold", size: 21)
print(font?.weight.rawValue)
print(font?.weight == .bold)
iOS 14: 0.40000000000000002 false
iOS 13: 0.40000000596046448 true
Has anybody else run into this?
Upvotes: 1
Views: 806
Reputation: 9935
This is not a bug. Both printed numbers are just approximations of 0.4, which has no exact binary representation. Floating-point values follow the IEEE 754 standard: a single-precision Float carries roughly 7 significant decimal digits, while a double-precision Double (which CGFloat is on 64-bit platforms) carries roughly 15-16. Your two values agree through the 8th significant digit and differ only in rounding error beyond that, so neither is "more correct"; they are simply different roundings of 0.4.
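A pure-Swift sketch of the point (no UIKit needed), showing that Float and Double round 0.4 differently:

```swift
let asFloat: Float = 0.4        // nearest single-precision value to 0.4
let asDouble: Double = 0.4      // nearest double-precision value to 0.4

// Widening the Float to Double keeps its single-precision rounding
// error, so the two doubles have different bit patterns.
print(Double(asFloat))               // ≈ 0.40000000596…
print(asDouble)                      // 0.4
print(Double(asFloat) == asDouble)   // false
```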
What you see is probably the effect of the conditional cast. On iOS 13 the weight trait was apparently stored at single precision, and converting Float to CGFloat carries the single-precision rounding error along (0.40000000596046448 is exactly Float(0.4) widened to Double). On iOS 14 the value appears to be stored at double precision already (0.40000000000000002 is the closest Double to 0.4). Either way, testing floating-point raw values with == is fragile; compare with a tolerance instead.
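One way around it is to snap the reported trait value to the nearest standard weight rather than testing exact equality. A minimal sketch; nearestWeight is a name I made up, not UIKit API:

```swift
import UIKit

extension UIFont {
    /// Resolves the descriptor's weight trait to the closest standard
    /// UIFont.Weight, tolerating small floating-point differences.
    var nearestWeight: UIFont.Weight {
        guard let traits = fontDescriptor.object(forKey: .traits) as? [UIFontDescriptor.TraitKey: Any],
              let weightValue = traits[.weight] as? CGFloat else {
            return .regular
        }
        let known: [UIFont.Weight] = [.ultraLight, .thin, .light, .regular,
                                      .medium, .semibold, .bold, .heavy, .black]
        // Pick the constant whose rawValue is closest to the reported value.
        return known.min {
            abs($0.rawValue - weightValue) < abs($1.rawValue - weightValue)
        } ?? .regular
    }
}
```

With this, the AvenirNext-Bold example should resolve to .bold regardless of whether the trait comes back at single or double precision.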
Upvotes: 1