Reputation: 4203
This is a more theoretical question/discussion, as I haven't been able to reach a clear answer from other SO posts and sources around the web. It seems like there are a lot of options:
Brad Larson's comment about AVFoundation
If I want to do hardware decoding on iOS for H.264 (mov) files, can I simply use AVFoundation and AVAssets, or should I use VideoToolbox (or some other framework)? When using these, how can I profile/benchmark the hardware performance when running a project? Is it by simply looking at the CPU usage in the "Debug Navigator" in Xcode?
In short, I'm basically asking whether AVFoundation & AVAssets perform hardware-accelerated decoding or not. Are they sufficient, and how do I benchmark the actual performance?
Thanks!
Upvotes: 9
Views: 2726
Reputation: 8244
If you want to decode a local file that is already on your iOS device, I'd use AVFoundation.
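For the local-file case, a minimal sketch with `AVAssetReader` might look like the following. The file path `video.mov` is a placeholder, and the pixel format is just one common choice; AVFoundation picks the hardware decoder for you where one is available.

```swift
import AVFoundation

// Sketch: pull decoded frames out of a local H.264 .mov file.
// AVFoundation routes the decode through the hardware codec
// automatically where available.
func decodeFrames(from url: URL) throws {
    let asset = AVURLAsset(url: url)
    guard let track = asset.tracks(withMediaType: .video).first else {
        return // no video track in this file
    }

    let reader = try AVAssetReader(asset: asset)
    let output = AVAssetReaderTrackOutput(
        track: track,
        outputSettings: [
            kCVPixelBufferPixelFormatTypeKey as String:
                kCVPixelFormatType_420YpCbCr8BiPlanarVideoRange
        ]
    )
    reader.add(output)
    reader.startReading()

    // Each sample buffer wraps one decoded frame (a CVPixelBuffer).
    while let sampleBuffer = output.copyNextSampleBuffer() {
        if let pixelBuffer = CMSampleBufferGetImageBuffer(sampleBuffer) {
            _ = pixelBuffer // process the decoded frame here
        }
    }
}
```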
If you want to decode a network stream (RTP or RTMP), use Video Toolbox, since you have to unpack the video stream yourself.
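For the streaming case, the core of it is a `VTDecompressionSession`. A hedged sketch of the session setup is below: building the `CMVideoFormatDescription` from the stream's SPS/PPS, and packaging each NAL unit into a `CMSampleBuffer` before calling `VTDecompressionSessionDecodeFrame`, are left out and are your responsibility when unpacking the stream yourself.

```swift
import VideoToolbox

// Sketch: create a decompression session for H.264 data you have
// unpacked yourself from an RTP/RTMP stream. `formatDescription`
// must be built from the stream's SPS/PPS parameter sets.
func makeDecompressionSession(
    formatDescription: CMVideoFormatDescription
) -> VTDecompressionSession? {
    var session: VTDecompressionSession?

    // Called once per decoded frame; imageBuffer is a CVPixelBuffer.
    var callback = VTDecompressionOutputCallbackRecord(
        decompressionOutputCallback: { _, _, status, _, imageBuffer, _, _ in
            guard status == noErr, let frame = imageBuffer else { return }
            _ = frame // hand the decoded frame to your renderer here
        },
        decompressionOutputRefCon: nil
    )

    let status = VTDecompressionSessionCreate(
        allocator: kCFAllocatorDefault,
        formatDescription: formatDescription,
        decoderSpecification: nil,   // nil lets the system pick the (hardware) decoder
        imageBufferAttributes: nil,
        outputCallback: &callback,
        decompressionSessionOut: &session
    )
    return status == noErr ? session : nil
}
```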
With either AVFoundation or Video Toolbox you will get hardware decoding.
Upvotes: 1