Reputation: 139
I am working on an iOS app that crops a rectangular region out of the camera feed. I am using CIDetector to find the rectangle features and CIFilter to perspective-correct and crop the rectangle, but after applying the filter the quality of the result image becomes very poor.
Here is my code below.
I get the video capture output from the following delegate method:
- (void)captureOutput:(AVCaptureOutput *)captureOutput didOutputSampleBuffer:(CMSampleBufferRef)sampleBuffer fromConnection:(AVCaptureConnection *)connection {
    UIImage *image = [self imageFromSampleBuffer:sampleBuffer];
    // Convert into CIImage to find the rect features.
    self.sourceImage = [[CIImage alloc] initWithCGImage:image.CGImage options:nil];
}
- (UIImage *)imageFromSampleBuffer:(CMSampleBufferRef)sampleBuffer
{
    CVPixelBufferRef pb = CMSampleBufferGetImageBuffer(sampleBuffer);
    CIImage *ciimg = [CIImage imageWithCVPixelBuffer:pb];

    // Render the CIImage into a CGImage so it can be wrapped in a UIImage.
    CIContext *context = [CIContext contextWithOptions:nil];
    CGImageRef ref = [context createCGImage:ciimg fromRect:ciimg.extent];
    UIImage *image = [UIImage imageWithCGImage:ref scale:1.0 orientation:UIImageOrientationUp];
    CGImageRelease(ref);
    return image;
}
I also run an NSTimer in the background that detects rectangle features in the captured source image every 0.2 seconds:
- (void)performRectangleDetection:(CIImage *)image {
    if (image == nil)
        return;

    NSArray *rectFeatures = [self.rectangleDetector featuresInImage:image];
    if ([rectFeatures count] > 0) {
        [self capturedImage:image];
    }
}
- (void)capturedImage:(CIImage *)image
{
    NSArray *rectFeatures = [self.rectangleDetector featuresInImage:image];
    CIImage *resultImage = [image copy];
    for (CIRectangleFeature *feature in rectFeatures) {
        // Chain the filter on resultImage so every detected feature is applied,
        // not just the last one.
        resultImage = [resultImage imageByApplyingFilter:@"CIPerspectiveCorrection"
                                     withInputParameters:@{@"inputTopLeft": [CIVector vectorWithCGPoint:feature.topLeft],
                                                           @"inputTopRight": [CIVector vectorWithCGPoint:feature.topRight],
                                                           @"inputBottomLeft": [CIVector vectorWithCGPoint:feature.bottomLeft],
                                                           @"inputBottomRight": [CIVector vectorWithCGPoint:feature.bottomRight]}];
    }
    UIImage *capturedImage = [[UIImage alloc] initWithCIImage:resultImage];
    UIImage *finalImage = [self imageWithImage:capturedImage scaledToSize:capturedImage.size];
}
The finalImage is produced by passing the captured image to this method:
- (UIImage *)imageWithImage:(UIImage *)image scaledToSize:(CGSize)newSize {
    UIGraphicsBeginImageContextWithOptions(newSize, NO, 0.0);
    [image drawInRect:CGRectMake(0, 0, newSize.width, newSize.height)];
    UIImage *newImage = UIGraphicsGetImageFromCurrentImageContext();
    UIGraphicsEndImageContext();
    return newImage;
}
The final image sometimes comes out blurred. Is that caused by the filter or by the camera output image? Please help me solve this.
Upvotes: 2
Views: 801
Reputation: 86
Use the following method to crop the image:

- (UIImage *)cropImage:(UIImage *)image withRect:(CGRect)rect {
    CGImageRef cgImage = CGImageCreateWithImageInRect(image.CGImage, rect);
    UIImage *croppedImage = [UIImage imageWithCGImage:cgImage];
    // CGImageCreateWithImageInRect follows the Create rule, so release it.
    CGImageRelease(cgImage);
    return croppedImage;
}
Upvotes: 0
Reputation: 425
It is likely that you are not recreating the final image with the correct scale factor.
UIImage *finalImage = [UIImage imageWithCGImage:resultImage
                                          scale:original.scale
                                    orientation:original.imageOrientation];
If this doesn't solve the issue, please provide more code sample from the camera input, and how you converted the final CIImage from the filters into UIImage.
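In particular, wrapping the filter output directly with -[UIImage initWithCIImage:] and then redrawing it in a graphics context is a common source of quality loss. A sketch of rendering the filtered CIImage through a CIContext at full resolution instead (ciContext, filteredImage, and original are assumed names, not from your code):

```objc
// Render the filtered CIImage to a CGImage at its full extent,
// then wrap it with the original image's scale and orientation.
CIContext *ciContext = [CIContext contextWithOptions:nil];
CGImageRef cgImage = [ciContext createCGImage:filteredImage
                                     fromRect:filteredImage.extent];
UIImage *finalImage = [UIImage imageWithCGImage:cgImage
                                          scale:original.scale
                                    orientation:original.imageOrientation];
CGImageRelease(cgImage);
```

Creating the CIContext once and reusing it is also worth doing if this runs on every frame, since context creation is expensive.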
Upvotes: 1