Reputation: 4248
I'm facing the following problem using GPUImage:
First I'm adding a grayscale filter to the camera, then I'm using GPUImageAverageColor
to get the average colour. The problem is that the colour I'm getting through the block is not in the grayscale range. What am I doing wrong?
This is the code:
GPUImageVideoCamera *videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
videoCamera.outputImageOrientation = UIInterfaceOrientationPortrait;
GPUImageView *filterView = (GPUImageView *)self.cameraLayer;
[filterView setFillMode:kGPUImageFillModePreserveAspectRatioAndFill];
GPUImageGrayscaleFilter *grayscaleFilter = [[GPUImageGrayscaleFilter alloc] init];
[videoCamera addTarget:grayscaleFilter];
[grayscaleFilter addTarget:filterView];
GPUImageAverageColor *averageColor = [[GPUImageAverageColor alloc] init];
[averageColor setColorAverageProcessingFinishedBlock:^(CGFloat redComponent, CGFloat greenComponent, CGFloat blueComponent, CGFloat alphaComponent, CMTime frameTime){
NSLog(@"Red: %f, green: %f, blue: %f, alpha: %f", redComponent, greenComponent, blueComponent, alphaComponent);
}];
[videoCamera addTarget:averageColor];
[videoCamera startCameraCapture];
Upvotes: 0
Views: 406
Reputation: 170319
That's because you're not actually running the average color operation on your grayscale output.
Look at how you've targeted the filters: you're sending the output of the video camera directly to your average color operation. Instead, send the output of the grayscale filter to the average color operation.
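A minimal sketch of the corrected targeting, reusing the variables from the question's code:
GPUImageGrayscaleFilter *grayscaleFilter = [[GPUImageGrayscaleFilter alloc] init];
GPUImageAverageColor *averageColor = [[GPUImageAverageColor alloc] init];
[videoCamera addTarget:grayscaleFilter];
// The on-screen preview still shows the grayscale frames.
[grayscaleFilter addTarget:filterView];
// Key change: the average color operation now receives the grayscale output,
// not the raw camera frames.
[grayscaleFilter addTarget:averageColor];
[averageColor setColorAverageProcessingFinishedBlock:^(CGFloat redComponent, CGFloat greenComponent, CGFloat blueComponent, CGFloat alphaComponent, CMTime frameTime){
    // With a grayscale input, the red, green, and blue components should come back (nearly) equal.
    NSLog(@"Red: %f, green: %f, blue: %f, alpha: %f", redComponent, greenComponent, blueComponent, alphaComponent);
}];
[videoCamera startCameraCapture];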
Upvotes: 1