qrtt1

Reputation: 7957

How do I render frames from FFmpeg to the screen in an iOS app?

I am a newbie to iOS, but I have already implemented an FFmpeg-based playback routine on Android, and I am planning to do the same on iOS.

It appears that I can use OpenGL ES 1.1 / 2.0 to draw decoded FFmpeg frames to the screen. However, OpenGL ES seems difficult, and I ran into the hardware limit on texture width: I had to split each video frame into several smaller images and draw them all to compose the frame.

Is there an easier way to render video with OpenGL ES on iOS? Or is there any other way to draw 2-D video frames to the screen quickly on iOS?

Upvotes: 1

Views: 1500

Answers (1)

appas

Reputation: 4138

Ah. So you want to render a non-POT (non-power-of-two) source. This can be done without splitting into multiple textures: create the closest POT-sized texture, render the frame into that, and draw only the part that actually contains the image. Have a look here for an example (C++). The relevant parts:

// Calculate the smallest power-of-two size that fits the frame,
// clamped to the hardware's maximum texture size.
double e = ceil(log((double)max(Texture.HardwareHeight, Texture.HardwareWidth)) / log(2.0));
texsize = min(pow(2.0, e), (double)maxSize);

then

// Draw the original frame 1:1 into the (larger) texture.
// Clamping to 1.0 handles cards whose maximum hardware texture size
// is smaller than the current frame size.
float ymax = min(2.0f * ((float)Texture.HardwareHeight / (float)texsize) - 1.0f, 1.0f);
float xmax = min(2.0f * ((float)Texture.HardwareWidth  / (float)texsize) - 1.0f, 1.0f);

Use these maximum values instead of 1.0 as texture coordinates when rendering.
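To make the idea concrete, here is a minimal self-contained sketch of the same technique for OpenGL ES 1.1 on iOS. It assumes the decoded frame has already been converted to RGBA (for example with sws_scale); frameWidth, frameHeight and rgbaPixels are placeholders for your own decoder output, and this version uses the conventional 0..1 texture-coordinate range rather than the -1..1 range of the example above:

#include <OpenGLES/ES1/gl.h>

// Upload one non-POT RGBA frame into the corner of a POT texture and
// return the texture coordinates that cover only the image part.
GLuint uploadFrame(int frameWidth, int frameHeight, const void *rgbaPixels,
                   float *uMax, float *vMax)
{
    // Round the frame size up to the nearest power of two.
    int texsize = 1;
    while (texsize < frameWidth || texsize < frameHeight)
        texsize <<= 1;

    GLuint tex;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);

    // Allocate the POT texture without data, then copy the non-POT
    // frame into its corner.
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, texsize, texsize, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, NULL);
    glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, frameWidth, frameHeight,
                    GL_RGBA, GL_UNSIGNED_BYTE, rgbaPixels);

    // Only this fraction of the texture contains the image.
    *uMax = (float)frameWidth  / (float)texsize;
    *vMax = (float)frameHeight / (float)texsize;
    return tex;
}

Draw your quad with texture coordinates running from (0, 0) to (uMax, vMax) and the padding in the texture is never sampled. For subsequent frames of the same size, keep the texture and repeat only the glTexSubImage2D call, which is much cheaper than recreating the texture every frame.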

Upvotes: 1
