miks

Reputation: 23

Understanding AudioStreamBasicDescription

I'm trying to understand the AudioStreamBasicDescription results. Practically none of what I get makes sense to me. For example:

AudioStreamBasicDescription(mSampleRate: 44100.0, mFormatID: 1819304813, mFormatFlags: 41, mBytesPerPacket: 4, mFramesPerPacket: 1, mBytesPerFrame: 4, mChannelsPerFrame: 2, mBitsPerChannel: 32, mReserved: 0)
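As an aside, mFormatID is a four-character code: 1819304813 spells 'lpcm', i.e. kAudioFormatLinearPCM. A quick sketch to verify that, using only standard CoreAudio constants:

import CoreAudio

// Decode the four ASCII bytes packed into the UInt32 format ID.
let formatID: AudioFormatID = 1819304813
let chars = [24, 16, 8, 0].map { Character(UnicodeScalar(UInt8((formatID >> $0) & 0xFF))) }
print(String(chars))                      // "lpcm"
print(formatID == kAudioFormatLinearPCM)  // true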

What I would expect: mBytesPerPacket and mBytesPerFrame should be 8, not 4:

4 (size of a 32-bit Float) x 2 (channels per frame) x 1 (frame per packet) = 8 bytes

Why is it 4?

import CoreAudio
import AudioUnit

// Describe the default HAL (hardware) I/O audio unit.
var inputUnitDescription = AudioComponentDescription(componentType: kAudioUnitType_Output,
                                                     componentSubType: kAudioUnitSubType_HALOutput,
                                                     componentManufacturer: kAudioUnitManufacturer_Apple,
                                                     componentFlags: 0,
                                                     componentFlagsMask: 0)
let defaultInput = AudioComponentFindNext(nil, &inputUnitDescription)

// Create an instance of that audio unit.
var inputUnit: AudioUnit?
AudioComponentInstanceNew(defaultInput!, &inputUnit)

// Ask for the stream format on the output scope of element 1 (the input bus),
// i.e. the format in which the unit hands captured audio to the app.
var asbd = AudioStreamBasicDescription()
var propertySize = UInt32(MemoryLayout<AudioStreamBasicDescription>.size)
AudioUnitGetProperty(inputUnit!,
                     kAudioUnitProperty_StreamFormat,
                     kAudioUnitScope_Output,
                     1,
                     &asbd,
                     &propertySize)

dump(asbd)

Upvotes: 2

Views: 1049

Answers (1)

hotpaw2

Reputation: 70673

Your ASBD has mFormatFlags == 41.

Since (mFormatFlags & 32) != 0, the format includes the kAudioFormatFlagIsNonInterleaved bit.
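For reference, a minimal sketch of checking those bits with the standard CoreAudio constants (kAudioFormatFlagIsFloat == 1, kAudioFormatFlagIsPacked == 8, kAudioFormatFlagIsNonInterleaved == 32; 41 is all three combined):

import CoreAudio

let flags: AudioFormatFlags = 41  // mFormatFlags from the question's ASBD

print(flags & kAudioFormatFlagIsFloat != 0)           // true: samples are Float32
print(flags & kAudioFormatFlagIsPacked != 0)          // true: samples use all available bits
print(flags & kAudioFormatFlagIsNonInterleaved != 0)  // true: each buffer carries one channel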

A non-interleaved format carries only one channel of data per frame, not 2. Instead you get multiple buffers, each holding a single channel, so a frame in any one buffer is 4 bytes (one Float32 sample), not 8.
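To picture that, here is a hypothetical sketch (the 256-frames-per-slice figure is mine) of the buffer layout an input render callback would see for this non-interleaved stereo Float32 format:

import CoreAudio

// Hypothetical illustration: non-interleaved stereo Float32, 256 frames per slice.
let frames = 256
let bytesPerChannel = UInt32(frames * MemoryLayout<Float32>.size)  // 1024 = 256 frames * mBytesPerFrame (4)

let bufferList = AudioBufferList.allocate(maximumBuffers: 2)       // one AudioBuffer per channel
for channel in 0..<bufferList.count {
    bufferList[channel] = AudioBuffer(
        mNumberChannels: 1,                  // non-interleaved: a single channel per buffer
        mDataByteSize: bytesPerChannel,
        mData: UnsafeMutableRawPointer.allocate(byteCount: Int(bytesPerChannel),
                                                alignment: MemoryLayout<Float32>.alignment))
}

Because mBytesPerFrame describes a single one of those per-channel buffers, 4 bytes per frame is exactly what a 2-channel non-interleaved Float32 stream should report.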

Upvotes: 2
