Léon Pelletier

Reputation: 2741

How do you use OpenGL ES 2.0 (shaders) for video processing?

This question is about iOS. On Android, it is very easy to use OpenGL ES 2.0 to render a texture to a view (for previewing) or to send it to an encoder (for file writing). I haven't been able to find any tutorial for iOS covering video playback (previewing a video effect from a file) or video recording (saving a video with an effect applied) using shader effects. Is this possible on iOS?

I've come across a demo about shaders called GLCameraRipple, but I have no clue how to use it more generically, e.g. with AVFoundation.

[EDIT]

I stumbled upon this tutorial about OpenGL ES, AVFoundation and video merging on iOS while searching for a snippet. That's another interesting entry point.

Upvotes: 2

Views: 969

Answers (1)

Tommy

Reputation: 100632

It's all very low-level stuff over in iOS land, with a whole bunch of pieces to connect.

The main thing you're likely to be interested in is CVOpenGLESTextureCache. As the CV prefix implies, it's part of Core Video; its primary point of interest here is CVOpenGLESTextureCacheCreateTextureFromImage, which "creates a live binding between the image buffer and the underlying texture object". The documentation also gives explicit advice on using such an image as a GL_COLOR_ATTACHMENT, i.e. the texture ID returned is usable both as a source and as a destination for OpenGL.
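To make that concrete, here's a rough Swift sketch of the texture-cache binding. It assumes an OpenGL ES 2.0 EAGLContext and kCVPixelFormatType_32BGRA pixel buffers, and the texture(from:) helper is just a name made up for illustration:

    import CoreVideo
    import OpenGLES

    // Sketch only: assumes an ES 2.0 context and 32BGRA pixel buffers.
    let eaglContext = EAGLContext(api: .openGLES2)!

    var textureCache: CVOpenGLESTextureCache?
    CVOpenGLESTextureCacheCreate(kCFAllocatorDefault, nil, eaglContext, nil, &textureCache)

    // Illustrative helper: map a pixel buffer's backing store onto a GL texture.
    func texture(from pixelBuffer: CVPixelBuffer) -> CVOpenGLESTexture? {
        var texture: CVOpenGLESTexture?
        let status = CVOpenGLESTextureCacheCreateTextureFromImage(
            kCFAllocatorDefault,
            textureCache!,
            pixelBuffer,
            nil,                                           // texture attributes
            GLenum(GL_TEXTURE_2D),
            GL_RGBA,                                       // internal format
            GLsizei(CVPixelBufferGetWidth(pixelBuffer)),
            GLsizei(CVPixelBufferGetHeight(pixelBuffer)),
            GLenum(GL_BGRA_EXT),                           // matches the 32BGRA buffer
            GLenum(GL_UNSIGNED_BYTE),
            0,                                             // plane index
            &texture)
        return status == kCVReturnSuccess ? texture : nil
    }

CVOpenGLESTextureGetName on the returned object gives you the texture ID, which you can bind either as a sampler for your fragment shader or as GL_COLOR_ATTACHMENT0 on a framebuffer you render into.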

The image buffer you bind will be a CVImageBuffer, one concrete type of which is CVPixelBuffer. You can supply pixel buffers to an AVAssetWriterInputPixelBufferAdaptor wired to an AVAssetWriter in order to write them out to a video file.
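A hedged sketch of that writing path; the output URL, dimensions and timing below are placeholders rather than anything from the question:

    import AVFoundation
    import CoreVideo

    // Sketch only: outputURL, dimensions and frame timing are placeholders.
    func makeWriter(outputURL: URL) throws -> (AVAssetWriter, AVAssetWriterInputPixelBufferAdaptor) {
        let writer = try AVAssetWriter(outputURL: outputURL, fileType: .mov)

        let input = AVAssetWriterInput(mediaType: .video, outputSettings: [
            AVVideoCodecKey: AVVideoCodecType.h264,
            AVVideoWidthKey: 1280,
            AVVideoHeightKey: 720
        ])

        // The adaptor accepts CVPixelBuffers; ask for BGRA so rendered frames can
        // come straight from the texture cache's render target.
        let adaptor = AVAssetWriterInputPixelBufferAdaptor(
            assetWriterInput: input,
            sourcePixelBufferAttributes: [
                kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA
            ])

        writer.add(input)
        if writer.startWriting() {
            writer.startSession(atSourceTime: .zero)
        }
        return (writer, adaptor)
    }

    // Per rendered frame, once adaptor.assetWriterInput.isReadyForMoreMediaData is true:
    //     adaptor.append(renderedPixelBuffer, withPresentationTime: frameTime)
    // and call writer.finishWriting(completionHandler:) when done.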

In the other direction, an AVAssetReaderOutput attached to an AVAssetReader will vend CMSampleBuffers, which can be queried for attached image buffers (if you've got video coming in and not just audio, there'll be some) that can then be mapped into OpenGL via a texture cache.
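And a matching sketch of the reading path; sourceURL and the handler closure are illustrative, and error handling is kept minimal:

    import AVFoundation
    import CoreMedia

    // Sketch only: pull decoded frames out of a movie file as CVPixelBuffers.
    func readFrames(from sourceURL: URL, handler: (CVPixelBuffer) -> Void) throws {
        let asset = AVAsset(url: sourceURL)
        guard let videoTrack = asset.tracks(withMediaType: .video).first else { return }

        let reader = try AVAssetReader(asset: asset)
        // Decode to BGRA so each frame can go through the texture cache unchanged.
        let output = AVAssetReaderTrackOutput(
            track: videoTrack,
            outputSettings: [kCVPixelBufferPixelFormatTypeKey as String: kCVPixelFormatType_32BGRA])
        reader.add(output)
        guard reader.startReading() else { return }

        while let sampleBuffer = output.copyNextSampleBuffer() {
            // Video sample buffers carry an attached image buffer; audio ones don't.
            if let imageBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
                handler(imageBuffer)   // e.g. map into GL via the texture cache and draw
            }
        }
    }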

Upvotes: 3
