Reputation: 77
What would be the best way to calculate the decode time of a frame decoded by MediaCodec? The execution time of the code below is clearly not the correct time. Is there any way to know how long each frame (or group of frames) took to decode?
Thanks.
startTime = System.nanoTime();
dequeueInputBuffer();
getInputBuffer();
// copy frame to input buffer
queueInputBuffer();
dequeueOutputBuffer();
releaseOutputBuffer();
stopTime = System.nanoTime();
execTime = stopTime - startTime;
Upvotes: 1
Views: 949
Reputation: 52303
It's difficult to get a meaningful measurement of the time required to decode a single frame, because you'll be measuring latency as well as throughput. Data has to be passed from the app to the mediaserver process, into the driver, decoded, and then the decoded data has to make the same journey in reverse. There can be additional pipelining in the driver itself.
You can get a reasonable approximation by decoding a few hundred frames and then dividing the total time by the number of frames.
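A minimal sketch of that approach: time a loop over many frames and divide once at the end. Here `decodeOneFrame()` is a hypothetical stand-in for your real queueInputBuffer/dequeueOutputBuffer loop (MediaCodec itself only runs on Android, so it is stubbed out here); the timing harness around it is the part that matters.

```java
public class DecodeThroughput {

    // Hypothetical placeholder: in a real app this would feed one encoded
    // frame into MediaCodec and drain any output buffers that are ready.
    static void decodeOneFrame() {
        // ... queueInputBuffer / dequeueOutputBuffer work goes here ...
    }

    // Average per-frame time in milliseconds over many frames.
    // This measures throughput; individual frames may still have
    // higher latency due to pipelining in the codec driver.
    static double averageDecodeMs(int frames) {
        long start = System.nanoTime();
        for (int i = 0; i < frames; i++) {
            decodeOneFrame();
        }
        long elapsedNanos = System.nanoTime() - start;
        return (elapsedNanos / 1e6) / frames;
    }

    public static void main(String[] args) {
        // A few hundred frames smooths out startup cost and scheduling noise.
        double avg = averageDecodeMs(300);
        System.out.println("avg ms/frame = " + avg);
    }
}
```

Timing the whole loop once, rather than each frame individually, also avoids the per-call overhead and clock-granularity noise you would get from wrapping every frame in its own start/stop pair.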
What is it you're trying to accomplish?
Upvotes: 2