Reputation: 40735
I want to scale up a UIImage so that the user can see its pixels very sharply. When I put it into a UIImageView and scale up the view's transform, the image appears antialiased and smoothed.
Is there a way to render into a bigger bitmap context by simply repeating every row and every column to get bigger pixels? How could I do that?
Upvotes: 8
Views: 5824
Reputation: 977
Swift 5
let image = UIImage(named: "Foo")!
let scaledImageSize = image.size.applying(CGAffineTransform(scaleX: 2, y: 2))
UIGraphicsBeginImageContext(scaledImageSize)
let scaledContext = UIGraphicsGetCurrentContext()!
scaledContext.interpolationQuality = .none
image.draw(in: CGRect(origin: .zero, size: scaledImageSize))
let scaledImage = UIGraphicsGetImageFromCurrentImageContext()!
UIGraphicsEndImageContext()
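On iOS 10 and later the same idea can be expressed with UIGraphicsImageRenderer, which manages the context for you. A minimal sketch, assuming a fixed 2x scale and an asset named "Foo":

let image = UIImage(named: "Foo")!
let scaledSize = image.size.applying(CGAffineTransform(scaleX: 2, y: 2))
let renderer = UIGraphicsImageRenderer(size: scaledSize)
let scaledImage = renderer.image { context in
    // Disable interpolation so the enlarged pixels stay sharp
    context.cgContext.interpolationQuality = .none
    image.draw(in: CGRect(origin: .zero, size: scaledSize))
}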
Upvotes: 2
Reputation: 553
When drawing directly into a bitmap context, we can use:
CGContextSetInterpolationQuality(myBitmapContext, kCGInterpolationNone);
I found this on CGContextDrawImage very slow on iPhone 4
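A fuller sketch of the same approach in Swift, assuming a hypothetical helper scaledPixelatedImage(from:scale:) that creates a new bitmap context at an integer scale and draws with interpolation disabled:

import UIKit

func scaledPixelatedImage(from image: UIImage, scale: CGFloat) -> UIImage? {
    guard let cgImage = image.cgImage else { return nil }
    let width = Int(CGFloat(cgImage.width) * scale)
    let height = Int(CGFloat(cgImage.height) * scale)
    guard let context = CGContext(data: nil,
                                  width: width,
                                  height: height,
                                  bitsPerComponent: 8,
                                  bytesPerRow: 0,
                                  space: CGColorSpaceCreateDeviceRGB(),
                                  bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue) else { return nil }
    // Swift equivalent of CGContextSetInterpolationQuality(ctx, kCGInterpolationNone)
    context.interpolationQuality = .none
    context.draw(cgImage, in: CGRect(x: 0, y: 0, width: width, height: height))
    return context.makeImage().map { UIImage(cgImage: $0) }
}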
Upvotes: 2
Reputation: 948
For a UIImage created from a CIImage, you can use:
imageView.image = UIImage(CIImage: ciImage.imageByApplyingTransform(CGAffineTransformMakeScale(kScale, kScale)))
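In current Swift syntax the same call would read roughly as follows (a sketch, assuming kScale is defined elsewhere):

imageView.image = UIImage(ciImage: ciImage.transformed(by: CGAffineTransform(scaleX: kScale, y: kScale)))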
Upvotes: -1
Reputation: 5659
I was also trying this (on a sublayer) and couldn't get it working; the result was still blurry. This is what I had to do:
const CGFloat PIXEL_SCALE = 2;
layer.magnificationFilter = kCAFilterNearest; //Nearest neighbor texture filtering
layer.transform = CATransform3DMakeScale(PIXEL_SCALE, PIXEL_SCALE, 1); //Scale layer up
//Rasterize w/ sufficient resolution to show sharp pixels
layer.shouldRasterize = YES;
layer.rasterizationScale = PIXEL_SCALE;
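A Swift version of the same idea might look like this (a sketch, assuming the layer belongs to a view named view):

let pixelScale: CGFloat = 2
view.layer.magnificationFilter = .nearest                                  // Nearest-neighbor texture filtering
view.layer.transform = CATransform3DMakeScale(pixelScale, pixelScale, 1)   // Scale layer up
view.layer.shouldRasterize = true                                          // Rasterize with sufficient resolution to show sharp pixels
view.layer.rasterizationScale = pixelScale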
Upvotes: 1
Reputation: 18333
#import <QuartzCore/CALayer.h>
view.layer.magnificationFilter = kCAFilterNearest;
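The Swift equivalent (a sketch, assuming a UIImageView named imageView):

imageView.layer.magnificationFilter = .nearest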
Upvotes: 26