Reputation: 11
I am trying to downsample realtime video on an iOS device (taking every second pixel, for example), with one condition: it has to run at 60 fps.
The resolution may be lowered, and the video view can be just a small rectangle (about 200x200 px). The result should then be previewed on the screen.
I have been using the excellent 2012 WWDC RosyWriter sample as a starting point, but after many hours of searching I cannot find a single tutorial or GitHub project from someone who has already done this and can explain how. I have screened all the relevant Stack Overflow questions over and over without success. I'm looking for a code sample, a good answer, or a quick pointer to a tutorial on this subject.
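To make the goal concrete, this is a rough sketch of the per-frame work I have in mind, assuming a 32BGRA CVPixelBuffer like the one RosyWriter's capture callback receives (the helper name and the pre-allocated half-size destination buffer are just for illustration):

#import <CoreVideo/CoreVideo.h>

// Nearest-neighbour downsample: copy every second pixel of every second row.
// dst is assumed to be a 32BGRA buffer with half the width and height of src.
static void DownsampleByTwo(CVPixelBufferRef src, CVPixelBufferRef dst)
{
    CVPixelBufferLockBaseAddress(src, kCVPixelBufferLock_ReadOnly);
    CVPixelBufferLockBaseAddress(dst, 0);

    const uint8_t *srcBase = CVPixelBufferGetBaseAddress(src);
    uint8_t *dstBase = CVPixelBufferGetBaseAddress(dst);
    size_t srcStride = CVPixelBufferGetBytesPerRow(src);
    size_t dstStride = CVPixelBufferGetBytesPerRow(dst);
    size_t dstWidth = CVPixelBufferGetWidth(dst);
    size_t dstHeight = CVPixelBufferGetHeight(dst);

    for (size_t y = 0; y < dstHeight; y++) {
        const uint32_t *srcRow = (const uint32_t *)(srcBase + 2 * y * srcStride);
        uint32_t *dstRow = (uint32_t *)(dstBase + y * dstStride);
        for (size_t x = 0; x < dstWidth; x++) {
            dstRow[x] = srcRow[2 * x]; // one 32-bit BGRA pixel
        }
    }

    CVPixelBufferUnlockBaseAddress(dst, 0);
    CVPixelBufferUnlockBaseAddress(src, kCVPixelBufferLock_ReadOnly);
}

Whether something like this (or a GPU equivalent) can keep up at 60 fps is exactly what I am trying to figure out.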
Relevant but still unanswered questions:
Upvotes: 1
Views: 1178
Reputation: 2699
Have a look at the SimpleVideoFilter example in GPUImage.
Then strip out the sepia filter, leaving something like this just to get a plain camera preview:
videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
GPUImageView *filterView = (GPUImageView *)self.view;
[videoCamera addTarget:filterView];
[videoCamera startCameraCapture];
This is the beginning of what Brad suggested in the comments:
videoCamera = [[GPUImageVideoCamera alloc] initWithSessionPreset:AVCaptureSessionPreset640x480 cameraPosition:AVCaptureDevicePositionBack];
GPUImageLanczosResamplingFilter* filter = [[GPUImageLanczosResamplingFilter alloc] init];
GPUImageView *filterView = (GPUImageView *)self.view;
[videoCamera addTarget:filter];
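// Render the filter at the view's size, so the Lanczos resampler does the downsampling on the GPU before the frames reach the preview.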
[filter forceProcessingAtSize:self.view.frame.size];
[filter addTarget:filterView];
[videoCamera startCameraCapture];
The output from the filter can also be sent to a GPUImageMovieWriter; the SimpleVideoFilter example covers that as well.
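The hookup looks roughly like this (a sketch only; the output path and the 200x200 size are examples, and movieWriter should be an instance variable so it stays alive while recording):

NSURL *movieURL = [NSURL fileURLWithPath:[NSTemporaryDirectory() stringByAppendingPathComponent:@"downsampled.m4v"]];
movieWriter = [[GPUImageMovieWriter alloc] initWithMovieURL:movieURL size:CGSizeMake(200.0, 200.0)];
[filter addTarget:movieWriter];
videoCamera.audioEncodingTarget = movieWriter;
[videoCamera startCameraCapture];
[movieWriter startRecording];

// ... later, when you are done recording:
[filter removeTarget:movieWriter];
[movieWriter finishRecording];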
Upvotes: 1