Reputation: 29877
I read the following from the official docs on MediaCodec:
Raw audio buffers contain entire frames of PCM audio data, which is one sample for each channel in channel order. Each PCM audio sample is either a 16 bit signed integer or a float, in native byte order.
https://source.android.com/devices/graphics/arch-sh
The way I read this, a buffer contains an entire frame of audio, but a frame is just one signed integer. That doesn't seem to make sense. Or is it two values, one each for the left and right channels? Why call it a buffer when it only holds a single value? To me, a buffer means many values spanning several milliseconds.
Upvotes: 3
Views: 99
Reputation: 10621
Here's what the docs for AudioFormat say:
For linear PCM, an audio frame consists of a set of samples captured at the same time, whose count and channel association are given by the channel mask, and whose sample contents are specified by the encoding. For example, a stereo 16 bit PCM frame consists of two 16 bit linear PCM samples, with a frame size of 4 bytes.
You are right that it wouldn't make sense to use a buffer for just one frame, and in practice buffers are filled with many frames.
You can figure out the number of frames in a buffer from the size property of MediaCodec.BufferInfo and the frame size.
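As a sketch of that arithmetic (assuming 16-bit PCM, so 2 bytes per sample, with the channel count taken from the track's MediaFormat; the buffer size of 4096 here is just an illustrative value, in real code it comes from MediaCodec.BufferInfo.size):

```java
public class FrameCount {
    // frameSize = channelCount * bytesPerSample,
    // e.g. stereo 16-bit PCM -> 2 * 2 = 4 bytes per frame
    static int frameCount(int bufferSizeBytes, int channelCount, int bytesPerSample) {
        int frameSizeBytes = channelCount * bytesPerSample;
        return bufferSizeBytes / frameSizeBytes;
    }

    public static void main(String[] args) {
        // A 4096-byte buffer of stereo 16-bit PCM holds 1024 frames
        System.out.println(frameCount(4096, 2, 2)); // prints 1024
    }
}
```

At 44.1 kHz, those 1024 frames are about 23 ms of audio, which matches the intuition that a buffer spans several milliseconds.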
Upvotes: 1