user3078414

Reputation: 1937

OSX: CoreAudio API for setting IO Buffer length?

This is a follow-up to a previous question: OSX CoreAudio: Getting inNumberFrames in advance - on initialization?

I am trying to figure out which AudioUnit API, if any, allows setting inNumberFrames or the preferred IO buffer duration of an input callback for a single HAL audio component instance in OSX (not a plug-in!). While there is comprehensive documentation on how this can be achieved in iOS by means of the AVAudioSession API, I can neither figure out nor find documentation on setting these values in OSX, whichever API that may be. The web is full of expert, yet conflicting statements, ranging from "There is an Audio Unit API to request a sample rate and a preferred buffer duration..." to "You can definitely get the number of frames, but only for the current callback call...".

Is there a way of at least getting (and adapting to) the inNumberFrames or the audio buffer length offered by the system for the input-selected sampling rates in OSX? For example, for 44.1k and its multiples (this seems to work partly), as well as for 48k and its multiples (this doesn't seem to work at all; I don't know what trick would allow adapting the buffer length to these values)? Here's the console printout:

Available 7 Sample Rates
Available Sample Rate value : 8000.000000
Available Sample Rate value : 16000.000000
Available Sample Rate value : 32000.000000
Available Sample Rate value : 44100.000000
Available Sample Rate value : 48000.000000
Available Sample Rate value : 88200.000000
Available Sample Rate value : 96000.000000

.mSampleRate          =   48000.00
.mFormatID            = 1819304813
.mBytesPerPacket      = 8
.mFramesPerPacket     = 1
.mBytesPerFrame       = 8
.mChannelsPerFrame    = 2
.mBitsPerChannel      = 32
.mFormatFlags         = 9
_mFormatHumanReadable = kAudioFormatFlagIsFloat 
    kAudioFormatFlagIsPacked 
    kLinearPCMFormatFlagIsFloat 
    kLinearPCMFormatFlagIsPacked 
    kLinearPCMFormatFlagsSampleFractionShift 
    kAppleLosslessFormatFlag_16BitSourceData 
    kAppleLosslessFormatFlag_24BitSourceData 

expectedInNumberFrames = 512

Couldn't render in current context (Error -10863)

The expected inNumberFrames is read from the system:

// Ask the AUHAL instance for the device's current IO buffer size,
// i.e. the number of frames to expect per input callback.
UInt32 expectedInNumberFrames = 0;
UInt32 propSize = sizeof(UInt32);
OSStatus err = AudioUnitGetProperty(gInputUnitComponentInstance,
                                    kAudioDevicePropertyBufferFrameSize,
                                    kAudioUnitScope_Global,
                                    0,
                                    &expectedInNumberFrames,
                                    &propSize);
// err should be checked; noErr means expectedInNumberFrames is valid.
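
For completeness, I assume the complementary setter goes through the same property on the AUHAL instance; a minimal sketch (the target value of 512 is just a hypothetical request, which the device may clamp or reject):

// Request a preferred IO buffer size before initializing the unit.
// This is a request, not a guarantee: the HAL may clamp the value
// to the device's supported range or refuse it entirely.
UInt32 preferredInNumberFrames = 512;  // hypothetical target size
OSStatus setErr = AudioUnitSetProperty(gInputUnitComponentInstance,
                                       kAudioDevicePropertyBufferFrameSize,
                                       kAudioUnitScope_Global,
                                       0,
                                       &preferredInNumberFrames,
                                       sizeof(preferredInNumberFrames));
// setErr != noErr indicates the device rejected the requested size.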

Thanks in advance for pointing me in the right direction!

Upvotes: 4

Views: 3157

Answers (1)

hotpaw2

Reputation: 70693

See this Apple Technical Note: https://developer.apple.com/library/mac/technotes/tn2321/_index.html#//apple_ref/doc/uid/DTS40013499-CH1-THE_I_O_BUFFER_SIZE

See the OS X example code in this technical note for GetIOBufferFrameSizeRange(), GetCurrentIOBufferFrameSize(), and SetCurrentIOBufferFrameSize().
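In case the link moves, those helpers are thin wrappers around the HAL's Audio Object property API. A sketch along those lines, assuming you already have the AudioDeviceID of the device in question (e.g. from kAudioHardwarePropertyDefaultInputDevice):

#include <CoreAudio/CoreAudio.h>

// Query the range of IO buffer sizes the device supports.
static OSStatus GetIOBufferFrameSizeRange(AudioObjectID inDeviceID,
                                          UInt32 *outMinimum,
                                          UInt32 *outMaximum)
{
    AudioObjectPropertyAddress theAddress = {
        kAudioDevicePropertyBufferFrameSizeRange,
        kAudioObjectPropertyScopeGlobal,
        kAudioObjectPropertyElementMaster };
    AudioValueRange theRange = { 0, 0 };
    UInt32 theDataSize = sizeof(theRange);
    OSStatus theError = AudioObjectGetPropertyData(inDeviceID, &theAddress,
                                                   0, NULL,
                                                   &theDataSize, &theRange);
    if (theError == noErr) {
        *outMinimum = (UInt32)theRange.mMinimum;
        *outMaximum = (UInt32)theRange.mMaximum;
    }
    return theError;
}

// Ask the device to use a specific IO buffer size; the return value
// tells you whether the request was accepted.
static OSStatus SetCurrentIOBufferFrameSize(AudioObjectID inDeviceID,
                                            UInt32 inIOBufferFrameSize)
{
    AudioObjectPropertyAddress theAddress = {
        kAudioDevicePropertyBufferFrameSize,
        kAudioObjectPropertyScopeGlobal,
        kAudioObjectPropertyElementMaster };
    return AudioObjectSetPropertyData(inDeviceID, &theAddress, 0, NULL,
                                      sizeof(inIOBufferFrameSize),
                                      &inIOBufferFrameSize);
}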

Note that there is an API property returning the allowed range, and an error return on the property setter. Also note that various Mac power-saving modes may change the buffer size while an app is running, so the actual buffer size, inNumberFrames, may not stay constant, or even be known until the Audio Unit starts running.
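
Since the size can change out from under you, one option is to watch the property with a HAL property listener and re-read the value whenever it fires. A hedged sketch (the callback name is a placeholder; the registration lines assume a deviceID obtained elsewhere):

// Invoked by the HAL whenever kAudioDevicePropertyBufferFrameSize
// changes (e.g. after a power-state transition).
static OSStatus BufferSizeChanged(AudioObjectID inObjectID,
                                  UInt32 inNumberAddresses,
                                  const AudioObjectPropertyAddress *inAddresses,
                                  void *inClientData)
{
    AudioObjectPropertyAddress addr = {
        kAudioDevicePropertyBufferFrameSize,
        kAudioObjectPropertyScopeGlobal,
        kAudioObjectPropertyElementMaster };
    UInt32 frameSize = 0;
    UInt32 dataSize = sizeof(frameSize);
    OSStatus err = AudioObjectGetPropertyData(inObjectID, &addr, 0, NULL,
                                              &dataSize, &frameSize);
    if (err == noErr) {
        // update whatever state depends on the current buffer size
    }
    return err;
}

// Registration, once the device is known:
// AudioObjectPropertyAddress addr = { kAudioDevicePropertyBufferFrameSize,
//                                     kAudioObjectPropertyScopeGlobal,
//                                     kAudioObjectPropertyElementMaster };
// AudioObjectAddPropertyListener(deviceID, &addr, BufferSizeChanged, NULL);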

If you get unusual buffer sizes (not a power of 2), it may be that the actual audio hardware in a particular Apple product model supports only a fixed or limited set of sample rates. When an app requests a rate the codec chips on the circuit board don't support, the OS resamples in software, and the buffers delivered to audio unit callbacks are resized accordingly.

Upvotes: 5
