Reputation: 75
I am attempting to draw rectangles (an array of CGRects) on a UIImage, which is then rendered back to the end user. The final result, for lack of a better phrase, just looks weird. For example, if I have an array of two CGRects, only one is drawn, barely, and not fully complete; the other is omitted. Please see the image below for how the image turns out with a particular array of CGRects. (FYI: I am sending the image to Google Cloud for processing, which responds with an array of CGRects.)
What am I missing?
private func drawOccurrencesOnImage(_ occurrences: [CGRect], _ image: UIImage) -> UIImage? {
    let imageSize = image.size
    let scale: CGFloat = 0.0 // 0.0 means "use the device's main screen scale"
    UIGraphicsBeginImageContextWithOptions(imageSize, false, scale)
    // Ensure the context is torn down on every exit path, including the early return below.
    defer { UIGraphicsEndImageContext() }

    image.draw(at: .zero)

    let ctx = UIGraphicsGetCurrentContext()
    ctx?.addRects(occurrences)
    ctx?.setStrokeColor(UIColor.red.cgColor)
    ctx?.setLineWidth(2.0)
    ctx?.strokePath()

    guard let drawnImage = UIGraphicsGetImageFromCurrentImageContext() else {
        presentAlert(alertTitle: "Error", alertText: "There was an issue, please try again")
        return nil
    }
    return drawnImage
}
A CGRect array ("occurrences" in the function above) with the value "[(298.0, 868.0, 65.0, 43.0), (464.0, 1017.0, 67.0, 36.0)]" results in the image below:
Further details: using Swift 4, iOS 12
Thank you!
Upvotes: 1
Views: 2227
Reputation: 4008
Your code is right. The coordinates are the problem:
"[(298.0, 868.0, 65.0, 43.0), (464.0, 1017.0, 67.0, 36.0)]"
Please consider the iPhone resolutions: your drawing context is sized in points (image.size), but Google Cloud returns rects in the image's pixel coordinates. On a 2x or 3x image the pixel dimensions are larger than the point dimensions, so rects like (464, 1017, …) can land partially or entirely outside the visible area, which matches what you are seeing.
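A minimal sketch of the conversion, assuming the rects come back in pixel space. The helper name `scaleRects` and the example sizes are mine, not from your code; in practice `pixelSize` would be `CGSize(width: image.size.width * image.scale, height: image.size.height * image.scale)` and `pointSize` would be `image.size`:

```swift
import Foundation

// Hypothetical helper: map rects from the image's pixel coordinate space
// (as returned by Google Cloud) into the point space used for drawing.
func scaleRects(_ rects: [CGRect],
                fromPixelSize pixelSize: CGSize,
                toPointSize pointSize: CGSize) -> [CGRect] {
    let sx = pointSize.width / pixelSize.width
    let sy = pointSize.height / pixelSize.height
    return rects.map { r in
        CGRect(x: r.origin.x * sx,
               y: r.origin.y * sy,
               width: r.size.width * sx,
               height: r.size.height * sy)
    }
}
```

You would pass the scaled rects to drawOccurrencesOnImage instead of the raw ones. You can verify it by drawing into a placeholder with coordinates you know are in range: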
imageView.image = drawOccurrencesOnImage([CGRect(x: 100, y: 100, width: 59, height: 60), CGRect(x: 120, y: 50, width: 59, height: 60)], UIImage(named: "person-placeholder")!)
Upvotes: 2