Reputation: 131426
The iOS implementation of Core Image includes a filter category, CICategoryVideo, whose filters can presumably be used to process a live video stream. This implies there is a workflow fast enough to take a frame of video, apply one or more filters to it, and pass the resulting frame along for display or saving.
Does anybody know of a sample project that illustrates how this is done? All I've done so far with CIFilters is to convert a UIImage to a CIImage, process it with a CIFilter, and then convert it back to a UIImage for display.
I assume that to use a CI filter in a video processing stream you have to work with Core Video pixel buffers: map them to CIImages, process them, and then map the results back to a pixel buffer. I've done that type of processing with OpenGL, and have even converted frames of video to CIImages for face detection in a video stream, but I don't know how to get the output of a CIFilter back into a pixel buffer fast enough to keep up with the frame rate of the video.
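For what it's worth, here is a minimal sketch of that pixel-buffer round trip from an AVCaptureVideoDataOutput callback. The class name, the choice of CISepiaTone, and the single cached output buffer are illustrative assumptions, not anything from an Apple sample; a production pipeline would typically draw destination buffers from a CVPixelBufferPool.

```swift
import AVFoundation
import CoreImage

// Sketch: filter each CVPixelBuffer from the capture callback and render the
// result into a second pixel buffer via CIContext, with no UIImage round trip.
// FrameFilterer, the sepia filter, and the cached buffer are illustrative only.
final class FrameFilterer: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    // Reuse one CIContext; creating a context per frame is far too slow for live video.
    private let ciContext = CIContext()
    private let filter = CIFilter(name: "CISepiaTone")!
    private var outputBuffer: CVPixelBuffer?

    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard let inputBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) else { return }

        // Wrap the incoming pixel buffer in a CIImage (no pixel copy).
        let inputImage = CIImage(cvPixelBuffer: inputBuffer)
        filter.setValue(inputImage, forKey: kCIInputImageKey)
        guard let filtered = filter.outputImage else { return }

        // Lazily create a destination buffer matching the source
        // (a CVPixelBufferPool is the better choice in real code).
        if outputBuffer == nil {
            CVPixelBufferCreate(kCFAllocatorDefault,
                                CVPixelBufferGetWidth(inputBuffer),
                                CVPixelBufferGetHeight(inputBuffer),
                                kCVPixelFormatType_32BGRA,
                                nil,
                                &outputBuffer)
        }
        guard let destination = outputBuffer else { return }

        // Render the filtered image into the destination pixel buffer.
        ciContext.render(filtered, to: destination)

        // `destination` now holds the filtered frame: hand it to a preview
        // view or an AVAssetWriterInputPixelBufferAdaptor from here.
    }
}
```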
Upvotes: 1
Views: 1364
Reputation: 2754
This project does exactly what you are describing, using CICategoryVideo:
https://developer.apple.com/library/content/samplecode/CIFunHouse/Introduction/Intro.html
The CIFunHouse project shows how to apply built-in and custom Core Image filters to photos and video. The app presents view controllers for adding photo and video sources, choosing CIFilters from a list, and making live adjustments to filter parameters. It also includes custom CIFilter subclasses for effects such as Sobel edge detection, an old-film look, and fake depth of field, and demonstrates how to save the filtered video stream to the ALAssetsLibrary while simultaneously previewing the video on screen.
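As a rough sketch of the recording side (not the sample's actual code, and using AVAssetWriter rather than the now-deprecated ALAssetsLibrary path), filtered frames can be rendered into buffers from the writer adaptor's pool and appended while the same CIImage also feeds the preview. The function and parameter names below are assumptions for illustration.

```swift
import AVFoundation
import CoreImage

// Illustrative helper: render a filtered camera frame into a pool buffer and
// append it to an AVAssetWriterInputPixelBufferAdaptor. Names are hypothetical.
func appendFiltered(pixelBuffer: CVPixelBuffer,
                    presentationTime: CMTime,
                    context: CIContext,
                    filter: CIFilter,
                    adaptor: AVAssetWriterInputPixelBufferAdaptor) {
    // Pull a destination buffer from the adaptor's pixel buffer pool.
    guard let pool = adaptor.pixelBufferPool else { return }
    var renderedBuffer: CVPixelBuffer?
    CVPixelBufferPoolCreatePixelBuffer(kCFAllocatorDefault, pool, &renderedBuffer)
    guard let destination = renderedBuffer else { return }

    // Filter the camera frame and render straight into the pool buffer.
    filter.setValue(CIImage(cvPixelBuffer: pixelBuffer), forKey: kCIInputImageKey)
    guard let filtered = filter.outputImage else { return }
    context.render(filtered, to: destination)

    // Append to the writer; the preview layer can display the same frame.
    if adaptor.assetWriterInput.isReadyForMoreMediaData {
        _ = adaptor.append(destination, withPresentationTime: presentationTime)
    }
}
```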
Sorry the answer came 3 years late.
Upvotes: 0