Reputation: 8651
I use GPUImageStillCamera to take a picture and load it into a GPUImageView.
Now, how can I apply a filter to that still GPUImageView?
The only examples I can find online first create a GPUImagePicture (which requires a UIImage), but it seems wasteful to create a UIImage from the GPUImageView and then load it into a GPUImagePicture when I could just apply the filters to the GPUImageView directly. I just don't know how.
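For reference, the GPUImagePicture route I'm trying to avoid would look roughly like this (just a sketch based on the GPUImage sample code, assuming I already had a UIImage of the captured photo, here called photo):
GPUImagePicture *stillImageSource = [[GPUImagePicture alloc] initWithImage:photo]; // photo: hypothetical UIImage of the capture
GPUImageSepiaFilter *sepiaFilter = [[GPUImageSepiaFilter alloc] init];
[stillImageSource addTarget:sepiaFilter];
[sepiaFilter addTarget:self.capturedImageView]; // render the filtered result back into the GPUImageView
[stillImageSource processImage];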
I've tried:
GPUImageFilter *sepiaFilter = [[GPUImageSepiaFilter alloc] init];
[sepiaFilter addTarget:gpuImageView];
// and then tried these in a bunch of different orders
[sepiaFilter useNextFrameForImageCapture];
[sepiaFilter endProcessing];
[sepiaFilter imageFromCurrentFramebuffer];
EDIT: Here is how I load the image into the GPUImageView
self.stillCamera = [[GPUImageStillCamera alloc] initWithSessionPreset:AVCaptureSessionPresetPhoto
                                                       cameraPosition:AVCaptureDevicePositionFront];
self.stillCamera.outputImageOrientation = UIInterfaceOrientationPortrait;

self.filterEmpty = [[GPUImageGammaFilter alloc] init];
[self.stillCamera addTarget:self.filterEmpty];
[self.filterEmpty addTarget:self.capturedImageView];
[self.stillCamera startCameraCapture];

[self.stillCamera capturePhotoAsJPEGProcessedUpToFilter:self.filterEmpty withCompletionHandler:^(NSData *processedJPEG, NSError *error) {
    [self.stillCamera stopCameraCapture];
    // and then from here the GPUImageView is loaded with the taken picture
}];
Upvotes: 1
Views: 1126
Reputation: 34780
Have a regular image view on top of the GPUImageView. Let's call that UIImageView imageView.
[self.stillCamera capturePhotoAsImageProcessedUpToFilter:self.filterEmpty
                                   withCompletionHandler:^(UIImage *processedImage, NSError *error) {
    imageView.image = processedImage;
    // stop, dispose of, or just continue with the camera,
    // depending on what you want with your app.
}];
At least, this is how I do it on a realtime photo filtering app, and it works perfectly.
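If you specifically want the sepia look from your question, a variation on the same idea (just a sketch, reusing the stillCamera and capturedImageView from your setup) is to put the sepia filter in the camera chain instead of the gamma filter and capture through it:
// Route the camera through a sepia filter so both the live preview
// and the captured photo come out already filtered.
GPUImageSepiaFilter *sepiaFilter = [[GPUImageSepiaFilter alloc] init];
[self.stillCamera addTarget:sepiaFilter];
[sepiaFilter addTarget:self.capturedImageView];

[self.stillCamera capturePhotoAsImageProcessedUpToFilter:sepiaFilter
                                   withCompletionHandler:^(UIImage *processedImage, NSError *error) {
    imageView.image = processedImage; // already sepia-toned
}];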
Upvotes: 0