Reputation: 193
What I need to do is decode video frames and render them on a trapezoidal surface. I'm using Android 2.2 as my development platform.
I'm not using the MediaPlayer service, since I need access to the decoded frames.
Here's what I have so far:
So now my problems are
So for my questions...
Upvotes: 4
Views: 6349
Reputation: 2218
Adding other video formats and codecs to Stagefright
If you have parsers for other video formats, you need to implement a Stagefright media extractor plug-in and integrate it into AwesomePlayer. Similarly, if you have OMX components for the required video codecs, you need to integrate them into the OMXCodec class. Using FFmpeg components in Stagefright, or using an FFmpeg-based player instead of Stagefright, does not seem trivial. However, if the required formats are already available in OpenCORE, you can modify the Android stack so that OpenCORE is chosen for those formats. You would then need to port the logic of getting YUV data to OpenCORE (i.e., get your hands dirty with MIOs).
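To make the extractor point more concrete, here is a rough skeleton of such a plug-in based on the froyo-era Stagefright headers. MediaExtractor, MediaSource, MetaData, DataSource and kKeyMIMEType are real Stagefright names, but MyFormatExtractor, MyFormatSource, the "video/x-my-format" MIME string and the "MYFT" magic bytes are hypothetical placeholders, and exact signatures may differ between Android releases:

```cpp
#include <string.h>

#include <media/stagefright/DataSource.h>
#include <media/stagefright/MediaBuffer.h>
#include <media/stagefright/MediaErrors.h>
#include <media/stagefright/MediaExtractor.h>
#include <media/stagefright/MediaSource.h>
#include <media/stagefright/MetaData.h>
#include <utils/String8.h>

using namespace android;

// Track source: wraps your parser and hands out one compressed frame per read().
struct MyFormatSource : public MediaSource {
    MyFormatSource(const sp<DataSource> &source) : mDataSource(source) {}

    virtual status_t start(MetaData *params) { return OK; }
    virtual status_t stop() { return OK; }

    virtual sp<MetaData> getFormat() {
        sp<MetaData> meta = new MetaData;
        meta->setCString(kKeyMIMEType, "video/x-my-format");  // placeholder MIME
        return meta;
    }

    virtual status_t read(MediaBuffer **out, const ReadOptions *options) {
        // Pull the next access unit from your parser into a MediaBuffer,
        // set kKeyTime on its meta_data(), and return it through *out.
        return ERROR_END_OF_STREAM;                           // stubbed in this sketch
    }

private:
    sp<DataSource> mDataSource;
};

// Extractor: parses container headers and exposes the tracks to AwesomePlayer.
struct MyFormatExtractor : public MediaExtractor {
    MyFormatExtractor(const sp<DataSource> &source) : mDataSource(source) {}

    virtual size_t countTracks() { return 1; }

    virtual sp<MediaSource> getTrack(size_t index) {
        if (index != 0) {
            return NULL;
        }
        return new MyFormatSource(mDataSource);
    }

    virtual sp<MetaData> getTrackMetaData(size_t index, uint32_t flags) {
        sp<MetaData> meta = new MetaData;
        meta->setCString(kKeyMIMEType, "video/x-my-format");
        return meta;
    }

private:
    sp<DataSource> mDataSource;
};

// Sniffer, registered via DataSource::RegisterSniffer(), so that
// MediaExtractor::Create() picks this extractor for matching files.
static bool SniffMyFormat(
        const sp<DataSource> &source, String8 *mimeType, float *confidence) {
    uint8_t header[4];
    if (source->readAt(0, header, sizeof(header)) < (ssize_t)sizeof(header)) {
        return false;
    }
    if (memcmp(header, "MYFT", 4) != 0) {                     // placeholder magic bytes
        return false;
    }
    *mimeType = String8("video/x-my-format");
    *confidence = 0.6f;
    return true;
}
```

The OMX side is the same idea: wrap your decoder as an OMX component and teach OMXCodec to select it for the new MIME type.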
Playback performance
The Surface Flinger, which is used for normal playback, uses overlays for rendering. It usually provides around 4-8 video buffers (from what I have seen so far), so you can check how many different buffers you are getting in your OpenGL rendering. Increasing the number of buffers will definitely improve performance. Also, check the time taken for YUV to RGB conversion; you can optimize it or use an open-source library to improve performance. Usually OpenGL is not used for video rendering (it is better known for graphics), so I am not sure about its performance.
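On the YUV-to-RGB point, a naive baseline such as the following routine (fixed-point BT.601 coefficients; the function name and I420-to-RGBA buffer layout are my own choices) is handy for measuring how much the conversion actually costs before deciding whether to optimize with NEON, a fragment shader, or an open-source library:

```cpp
#include <stdint.h>

// Clamp an int to the 0..255 range of one color channel.
static inline uint8_t clamp8(int v) {
    return (uint8_t)(v < 0 ? 0 : (v > 255 ? 255 : v));
}

// Naive YUV420 planar (I420) to RGBA8888 conversion using fixed-point
// BT.601 coefficients. Purely a reference for timing; a NEON or
// shader-based implementation will be far faster on device.
void I420ToRGBA(const uint8_t *y, const uint8_t *u, const uint8_t *v,
                int width, int height, uint8_t *rgba) {
    for (int row = 0; row < height; ++row) {
        const uint8_t *yRow = y + row * width;
        const uint8_t *uRow = u + (row / 2) * (width / 2);
        const uint8_t *vRow = v + (row / 2) * (width / 2);
        uint8_t *out = rgba + row * width * 4;

        for (int col = 0; col < width; ++col) {
            int Y = yRow[col];
            int U = uRow[col / 2] - 128;
            int V = vRow[col / 2] - 128;

            out[4 * col + 0] = clamp8(Y + ((91881 * V) >> 16));              // R
            out[4 * col + 1] = clamp8(Y - ((22554 * U + 46802 * V) >> 16));  // G
            out[4 * col + 2] = clamp8(Y + ((116130 * U) >> 16));             // B
            out[4 * col + 3] = 255;                                          // A
        }
    }
}
```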
Audio Video Sync
Audio time is used as the reference. In Stagefright, AwesomePlayer uses an AudioPlayer to play out the audio. This player implements an interface that provides time data, and AwesomePlayer uses it when rendering video. Basically, video frames are rendered when their presentation time matches that of the audio sample currently being played.
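As a rough illustration of that idea (not the actual AwesomePlayer code; the AudioClock and Renderer interfaces below are hypothetical stand-ins for the audio-side time source and your OpenGL renderer), the per-frame decision looks something like this:

```cpp
#include <stdint.h>
#include <unistd.h>

// Hypothetical interfaces: in Stagefright the audio-side time comes from
// the AudioPlayer through a time-source interface; these names are placeholders.
struct AudioClock { virtual int64_t nowUs() const = 0; };   // current audio playback position
struct VideoFrame { int64_t ptsUs; /* decoded YUV/RGB data ... */ };
struct Renderer   { virtual void render(const VideoFrame &f) = 0; };

// Present one decoded frame against the audio clock: wait if it is early,
// drop it if it is hopelessly late, otherwise draw it.
void presentFrame(const VideoFrame &frame, const AudioClock &clock, Renderer &out) {
    const int64_t kDropThresholdUs = 40000;   // roughly one frame at 25 fps
    int64_t lateUs = clock.nowUs() - frame.ptsUs;

    if (lateUs < 0) {
        usleep((useconds_t)(-lateUs));        // early: wait for the presentation time
    } else if (lateUs > kDropThresholdUs) {
        return;                               // too late: drop the frame to catch up
    }
    out.render(frame);
}
```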
Shash
Upvotes: 2