Brian

Reputation: 8085

Freehand drawing over UIImage

I want to allow a user to draw over a photo. I'm currently following this tutorial that shows how to make a simple drawing app using UIKit.

There are two UIImageViews (tempImageView and mainImageView) in the view controller. When the user drags their finger, they draw directly on tempImageView, and when they release, the images of tempImageView and mainImageView are merged to form the new main image.

Right now, my app behaves like this:

[Screenshots: the original photo before drawing (left) and the same photo, stretched, after the first stroke (right)]

When I begin (left screenshot), I set mainImageView to an existing photo that I want the user to draw over. The problem is that as soon as the user draws a stroke and lifts their finger, the original photo in the background becomes stretched (right screenshot). The stretching happens during the merge of tempImageView into mainImageView. How can I fix this?

I have tried setting the content mode (aspect fill, center, etc.) of the image view and that did not change anything.

The view controller is created using Storyboard and simply has two UIImageViews on top of each other. Here is the entire FingerpaintViewController code:

import UIKit

class FingerpaintViewController: UIViewController {

    @IBOutlet weak var mainImageView: UIImageView!
    @IBOutlet weak var tempImageView: UIImageView!

    private var lastPoint = CGPoint.zero
    private var red: CGFloat = 0.0
    private var green: CGFloat = 0.0
    private var blue: CGFloat = 0.0
    private var brushWidth: CGFloat = 10.0
    private var opacity: CGFloat = 1.0
    private var swiped = false

    override func viewDidLoad() {
        super.viewDidLoad()

        mainImageView.image = UIImage(named: "golden_gate_bridge")
    }

    override func touchesBegan(touches: Set<UITouch>, withEvent event: UIEvent?) {
        swiped = false
        if let touch = touches.first {
            lastPoint = touch.locationInView(tempImageView)
        }
    }

    override func touchesMoved(touches: Set<UITouch>, withEvent event: UIEvent?) {
        swiped = true
        if let touch = touches.first {
            let currentPoint = touch.locationInView(tempImageView)
            drawLineFrom(lastPoint, toPoint: currentPoint)

            lastPoint = currentPoint
        }
    }

    override func touchesEnded(touches: Set<UITouch>, withEvent event: UIEvent?) {
        if !swiped {
            drawLineFrom(lastPoint, toPoint: lastPoint)
        }

        // Merge tempImageView into mainImageView
        UIGraphicsBeginImageContext(mainImageView.frame.size)
        mainImageView.image?.drawInRect(CGRect(x: 0, y: 0, width: mainImageView.frame.size.width, height: mainImageView.frame.size.height), blendMode: .Normal, alpha: 1.0)
        tempImageView.image?.drawInRect(CGRect(x: 0, y: 0, width: mainImageView.frame.size.width, height: mainImageView.frame.size.height), blendMode: .Normal, alpha: opacity)
        mainImageView.image = UIGraphicsGetImageFromCurrentImageContext()
        UIGraphicsEndImageContext()

        tempImageView.image = nil
    }

    func drawLineFrom(fromPoint: CGPoint, toPoint: CGPoint) {
        UIGraphicsBeginImageContext(mainImageView.frame.size)
        let context = UIGraphicsGetCurrentContext()
        tempImageView.image?.drawInRect(CGRect(x: 0, y: 0, width: mainImageView.frame.size.width, height: mainImageView.frame.size.height))

        // Append a line segment from the previous point to the current point
        CGContextMoveToPoint(context, fromPoint.x, fromPoint.y)
        CGContextAddLineToPoint(context, toPoint.x, toPoint.y)

        // Configure the stroke: round line cap, brush width, color, and normal blend mode
        CGContextSetLineCap(context, .Round)
        CGContextSetLineWidth(context, brushWidth)
        CGContextSetRGBStrokeColor(context, red, green, blue, 1.0)
        CGContextSetBlendMode(context, .Normal)

        // Stroke the path into the bitmap context
        CGContextStrokePath(context)

        // Capture the updated drawing and assign it to tempImageView
        tempImageView.image = UIGraphicsGetImageFromCurrentImageContext()
        tempImageView.alpha = opacity
        UIGraphicsEndImageContext()

    }
}

Upvotes: 2

Views: 2433

Answers (2)

Mikkel Selsøe

Reputation: 1201

There's an API in AVFoundation that solves this exact issue of fitting an image into a CGRect while preserving aspect ratio:

AVMakeRect(aspectRatio: image.size, insideRect: view.bounds)
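
For illustration, here is a minimal sketch of how that could slot into the merge step in touchesEnded, written in the Swift 2 syntax used in the question (where the helper is imported as the C function AVMakeRectWithAspectRatioInsideRect). This is an untested sketch that reuses the question's outlets, not a definitive implementation:

import AVFoundation

// Inside touchesEnded, replacing the merge block (sketch only)
UIGraphicsBeginImageContext(mainImageView.frame.size)

if let photo = mainImageView.image {
    // Largest rect that fits the photo inside the view without changing its aspect ratio
    let photoRect = AVMakeRectWithAspectRatioInsideRect(photo.size, mainImageView.bounds)
    photo.drawInRect(photoRect, blendMode: .Normal, alpha: 1.0)
}
tempImageView.image?.drawInRect(mainImageView.bounds, blendMode: .Normal, alpha: opacity)

mainImageView.image = UIGraphicsGetImageFromCurrentImageContext()
UIGraphicsEndImageContext()

tempImageView.image = nil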

Upvotes: 0

Rory O'Logan

Reputation: 181

I think the problem is that mainImageView.image?.drawInRect is called with the size of the image view rather than the rect of the image as it is actually displayed.

I don't use Swift, but here's what I would use in Objective-C to get the displayed rect of the image to pass to drawInRect.

CGRect imageRect = [self getImageRectForImageView:mainImageView];

- (CGRect)getImageRectForImageView:(UIImageView *)imageView
{
    // Aspect ratios of the image and of the image view's bounds
    CGFloat imageRatio = imageView.image.size.width / imageView.image.size.height;
    CGFloat viewRatio = imageView.bounds.size.width / imageView.bounds.size.height;

    if (viewRatio > imageRatio)
    {
        // View is wider than the image: fit to the view's height, center horizontally
        CGSize imageSize = CGSizeMake(imageView.image.size.width * imageView.bounds.size.height / imageView.image.size.height,
                                      imageView.bounds.size.height);
        return CGRectMake((imageView.bounds.size.width - imageSize.width) / 2,
                          (imageView.bounds.size.height - imageSize.height) / 2,
                          imageSize.width,
                          imageSize.height);
    }
    else
    {
        // View is taller than the image: fit to the view's width, center vertically
        CGSize imageSize = CGSizeMake(imageView.bounds.size.width,
                                      imageView.image.size.height * imageView.bounds.size.width / imageView.image.size.width);
        return CGRectMake((imageView.bounds.size.width - imageSize.width) / 2,
                          (imageView.bounds.size.height - imageSize.height) / 2,
                          imageSize.width,
                          imageSize.height);
    }
}

Then you can use the new imageRect value for your drawInRect call.

// Merge tempImageView into mainImageView
UIGraphicsBeginImageContext(mainImageView.frame.size)
mainImageView.image?.drawInRect(imageRect)
...
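
If it helps, here is a rough Swift translation of the same helper, in the Swift 2 style used in the question (an untested sketch that mirrors the Objective-C logic above):

func imageRectForImageView(imageView: UIImageView) -> CGRect {
    guard let image = imageView.image else { return imageView.bounds }

    // Aspect ratios of the image and of the image view's bounds
    let imageRatio = image.size.width / image.size.height
    let viewRatio = imageView.bounds.width / imageView.bounds.height

    // Fit the image into the view on one axis and letterbox the other
    let imageSize: CGSize
    if viewRatio > imageRatio {
        imageSize = CGSize(width: image.size.width * imageView.bounds.height / image.size.height,
                           height: imageView.bounds.height)
    } else {
        imageSize = CGSize(width: imageView.bounds.width,
                           height: image.size.height * imageView.bounds.width / image.size.width)
    }

    // Center the scaled image inside the view
    return CGRect(x: (imageView.bounds.width - imageSize.width) / 2,
                  y: (imageView.bounds.height - imageSize.height) / 2,
                  width: imageSize.width,
                  height: imageSize.height)
}

// Usage in touchesEnded:
// let imageRect = imageRectForImageView(mainImageView)
// mainImageView.image?.drawInRect(imageRect)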

Upvotes: 2
