Reputation: 229
I'm currently bringing a legacy project up from iOS 5/6 to iOS 6/7.
Part of this project involves taking a picture using the GPUImage library, processing it with a crop filter, and then optionally adding saturation and blur effects. I am currently using version 0.1.2, installed via CocoaPods.
The problem I am having is that when I try to capture an image from the camera, I hit the following assert in GPUImageStillCamera.m, line 254:
if (CVPixelBufferGetPlaneCount(cameraFrame) > 0)
{
    NSAssert(NO, @"Error: no downsampling for YUV input in the framework yet");
}
where cameraFrame is a CVImageBufferRef.
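For what it's worth, here is a minimal diagnostic sketch (not part of my original code; the helper name is made up) that logs the pixel format the still image output is actually configured for. Since the assert only fires for planar YUV buffers, comparing this value between the working and failing projects may narrow things down:

#import <AVFoundation/AVFoundation.h>

- (void)logStillOutputPixelFormat
{
    // GPUImageStillCamera exposes its AVCaptureSession via the readonly
    // captureSession property inherited from GPUImageVideoCamera.
    for (AVCaptureOutput *output in self.camera.captureSession.outputs) {
        if ([output isKindOfClass:[AVCaptureStillImageOutput class]]) {
            NSDictionary *settings = [(AVCaptureStillImageOutput *)output outputSettings];
            NSNumber *format = settings[(id)kCVPixelBufferPixelFormatTypeKey];
            // kCVPixelFormatType_32BGRA has no planes and won't hit the
            // assert; the biplanar YUV formats ('420v' / '420f') will.
            NSLog(@"Still image output pixel format: %@", format);
        }
    }
}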
I reproduced the code where this is called and moved it to another project, where it works perfectly.
Once I moved this reproduced class back into the main project, I hit the assert every time.
I've ruled out everything I could with my own debugging, which has led me to believe that it might be a project setting I've overlooked. Any help, or even a pointer in the right direction, would be very welcome. I've spent a good 1-2 days on this now and am still entirely lost!
I've included the stripped-down class below, which shows the general usage.
#import "ViewController.h"
#import "GPUImage.h"
#import "ImageViewController.h"
@interface ViewController ()
@property (nonatomic, strong) IBOutlet GPUImageView *gpuImageView;
@property (nonatomic, strong) GPUImageStillCamera *camera;
@property (nonatomic, strong) GPUImageCropFilter *cropFilter;
@end
@implementation ViewController
- (void)viewDidAppear:(BOOL)animated
{
[super viewDidAppear:animated];
[self setupCameraCapture];
}
- (void)setupCameraCapture
{
if (self.camera) {
return;
}
self.cropFilter = [[GPUImageCropFilter alloc] initWithCropRegion:CGRectMake(0, 0, 1, 0.5625)];
if ([UIImagePickerController isCameraDeviceAvailable:UIImagePickerControllerCameraDeviceRear]) {
self.camera = [[GPUImageStillCamera alloc] initWithSessionPreset:AVCaptureSessionPresetPhoto cameraPosition:AVCaptureDevicePositionBack];
}
else {
self.camera = [[GPUImageStillCamera alloc] initWithSessionPreset:AVCaptureSessionPresetPhoto cameraPosition:AVCaptureDevicePositionFront];
}
self.camera.outputImageOrientation = UIInterfaceOrientationPortrait;
NSError *error = nil;
[self.camera.inputCamera lockForConfiguration:&error];
[self.camera.inputCamera setExposureMode:AVCaptureExposureModeContinuousAutoExposure];
[self.camera.inputCamera setWhiteBalanceMode:AVCaptureWhiteBalanceModeContinuousAutoWhiteBalance];
if ([self.camera.inputCamera respondsToSelector:@selector(isLowLightBoostSupported)]) {
BOOL isSupported = self.camera.inputCamera.isLowLightBoostSupported;
if (isSupported) {
[self.camera.inputCamera setAutomaticallyEnablesLowLightBoostWhenAvailable:YES];
}
}
[self.camera.inputCamera unlockForConfiguration];
[self.camera addTarget:self.cropFilter];
[self.cropFilter addTarget:self.gpuImageView];
[self.camera startCameraCapture];
}
- (IBAction)capturePressed:(id)sender
{
[self.camera capturePhotoAsImageProcessedUpToFilter:self.cropFilter withCompletionHandler:^(UIImage *image, NSError *error) {
// do something with the image here
}];
}
@end
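For completeness, here is a rough sketch of the optional saturation and blur step mentioned above, using GPUImage's imageByFilteringImage: convenience on the already-cropped still. The method name and filter values are illustrative only, not from my actual code:

- (UIImage *)stylizedImageFromImage:(UIImage *)image
{
    // Saturation of 1.0 leaves the image unchanged; higher values boost it.
    GPUImageSaturationFilter *saturationFilter = [[GPUImageSaturationFilter alloc] init];
    saturationFilter.saturation = 1.4;

    GPUImageGaussianBlurFilter *blurFilter = [[GPUImageGaussianBlurFilter alloc] init];
    blurFilter.blurRadiusInPixels = 2.0;

    // Run each filter as a one-shot pass over the captured UIImage.
    UIImage *saturated = [saturationFilter imageByFilteringImage:image];
    return [blurFilter imageByFilteringImage:saturated];
}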
Upvotes: 0
Views: 593
Reputation: 229
The actual culprit was a swizzled method, found by my colleague Marek, hidden away in the depths of the old codebase. The above code works fine.
Lesson: if you really have to swizzle something, make sure you leave proper documentation for future devs.
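For illustration, here is a minimal sketch of what a properly documented swizzle might look like (the category, selector, and ticket names are all made up for the example):

#import <UIKit/UIKit.h>
#import <objc/runtime.h>

@implementation UIViewController (XYZLegacyCapture)

+ (void)load
{
    // SWIZZLED: viewDidAppear: <-> xyz_viewDidAppear:
    // Why: <reason / ticket number goes here>
    // Added by: <name>, <date>. Remove when the legacy flow is rewritten.
    Method original = class_getInstanceMethod(self, @selector(viewDidAppear:));
    Method swizzled = class_getInstanceMethod(self, @selector(xyz_viewDidAppear:));
    method_exchangeImplementations(original, swizzled);
}

- (void)xyz_viewDidAppear:(BOOL)animated
{
    // Because the implementations were exchanged, this call actually
    // invokes the original viewDidAppear:.
    [self xyz_viewDidAppear:animated];
}

@end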
Upvotes: 3