LolliPop

Reputation: 107

Dependency between wave data format and input buffer size

I am putting together a quick solution using VFW. While preparing to receive audio data, I ask VFW for the default wave data format of the attached webcam. It returns a WAVEFORMATEX struct; see pic 1.

[pic 1: WAVEFORMATEX values, showing 1 byte per sample, 1 channel, 11025 bytes per second]

As you can see, it is 1 byte per sample and 1 channel, at 11025 bytes per second. But the callback receives a different number of bytes; see pic 2.
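For reference, a minimal sketch of how such a query can look via VFW's capGetAudioFormat macro. This is an assumption about the setup described above; hWndCap and QueryAudioFormat are illustrative names, not from the original post:

```cpp
#include <windows.h>
#include <vfw.h>
#pragma comment(lib, "vfw32.lib")

// Ask the capture window for its current audio format.
// hWndCap is assumed to have been created with capCreateCaptureWindow
// and connected to a driver elsewhere.
WAVEFORMATEX QueryAudioFormat(HWND hWndCap)
{
    WAVEFORMATEX wfx = {};
    capGetAudioFormat(hWndCap, &wfx, sizeof(wfx));
    // For PCM: nBlockAlign = nChannels * wBitsPerSample / 8,
    // and nAvgBytesPerSec = nSamplesPerSec * nBlockAlign.
    return wfx;
}
```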

So the question is: what is the relationship between these two values? And how should the data be separated in such a case if there are 2 channels?

Upvotes: 0

Views: 87

Answers (1)

Roman Ryltsov

Reputation: 69632

dwBufferLength is the size of the buffer, not the number of captured bytes. You are interested in another member: dwBytesRecorded.
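As a sketch of where that member comes into play, assuming the audio arrives through a wave-stream callback registered with capSetCallbackOnWaveStream (the VFW route implied by the question); ProcessAudio is a hypothetical handler, not a VFW function:

```cpp
#include <windows.h>
#include <vfw.h>

void ProcessAudio(const BYTE* data, DWORD bytes); // hypothetical downstream handler

// Wave-stream callback: VFW hands us a WAVEHDR per filled buffer.
// Only dwBytesRecorded bytes contain fresh audio; dwBufferLength is
// merely the allocated capacity and can be larger.
LRESULT CALLBACK WaveStreamCallback(HWND hWnd, LPWAVEHDR lpWHdr)
{
    const BYTE* data  = reinterpret_cast<const BYTE*>(lpWHdr->lpData);
    DWORD       bytes = lpWHdr->dwBytesRecorded; // captured bytes, not dwBufferLength

    ProcessAudio(data, bytes);
    return (LRESULT)TRUE;
}

// Registration (hWndCap is the capture window):
//   capSetCallbackOnWaveStream(hWndCap, WaveStreamCallback);
```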

In the case of stereo, WAVEFORMATEX will have nChannels = 2, and nBlockAlign and nAvgBytesPerSec will be adjusted accordingly. The bytes in the buffers will have the following packing (offsets assume 8-bit samples):

0000: [sample 0, channel 0]
0001: [sample 0, channel 1]
0002: [sample 1, channel 0]
0003: [sample 1, channel 1]
...
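Given that packing, separating the channels reduces to taking every other byte. A minimal sketch for the 8-bit stereo case laid out above; Deinterleave is an illustrative helper, not part of any API:

```cpp
#include <cstdint>
#include <vector>

// Split an interleaved 8-bit stereo buffer into left/right channels.
// Assumes wBitsPerSample = 8 and nChannels = 2, so each frame is
// 2 bytes: [channel 0][channel 1].
void Deinterleave(const uint8_t* data, size_t bytes,
                  std::vector<uint8_t>& left, std::vector<uint8_t>& right)
{
    for (size_t i = 0; i + 1 < bytes; i += 2)
    {
        left.push_back(data[i]);      // channel 0
        right.push_back(data[i + 1]); // channel 1
    }
}
```

For 16-bit samples the same pattern applies, just with int16_t elements and a 4-byte frame.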

Upvotes: 0
