Reputation: 31
I'm working on a HAL virtual audio device.
I'm having problems getting the correct buffer size from the virtual audio device to my application...
How would I implement the properties kAudioDevicePropertyBufferFrameSize or kAudioDevicePropertyBufferFrameSizeRange for my virtual HAL device?
Specifically, how would I add them to Apple's NullAudio example found here: https://developer.apple.com/documentation/coreaudio/creating_an_audio_server_driver_plug-in
I tried adding them to my device the same way kAudioDevicePropertyNominalSampleRate is handled in the nullAudio.c example, but with no success...
Upvotes: 2
Views: 257
Reputation: 796
You have to set kAudioDevicePropertyBufferFrameSize in your client application (using AudioObjectSetPropertyData).
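A minimal sketch of that client-side call, assuming deviceID is the AudioObjectID of your virtual device, obtained elsewhere (e.g. by matching kAudioDevicePropertyDeviceUID); the helper name SetBufferFrameSize is my own:

```c
#include <CoreAudio/CoreAudio.h>

// Ask CoreAudio for an IO buffer of "frames" frames on the given device.
static OSStatus SetBufferFrameSize(AudioObjectID deviceID, UInt32 frames)
{
    AudioObjectPropertyAddress address = {
        kAudioDevicePropertyBufferFrameSize,
        kAudioObjectPropertyScopeGlobal,
        kAudioObjectPropertyElementMaster  // kAudioObjectPropertyElementMain on macOS 12+
    };
    return AudioObjectSetPropertyData(deviceID, &address, 0, NULL,
                                      sizeof(frames), &frames);
}

// Usage: OSStatus err = SetBufferFrameSize(deviceID, 512);
```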
You can't control the kAudioDevicePropertyBufferFrameSize property from an AudioServerPlugin. It's only used by client processes to set the size of the IO buffers their IO procs receive.
When several clients use your device at the same time, CoreAudio lets them all use different IO buffer sizes (which might not be multiples/factors of each other), so your plug-in has to handle buffers of various sizes.
Source: https://lists.apple.com/archives/coreaudio-api/2013/Mar/msg00152.html
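To illustrate the consequence on the plug-in side, here is a sketch of an IO operation callback that sizes its per-cycle work off inIOBufferFrameSize instead of a fixed constant. The function name and the 2-channel interleaved Float32 stream format are assumptions, not code from nullAudio.c:

```c
#include <CoreAudio/AudioServerPlugIn.h>
#include <string.h>

// inIOBufferFrameSize can differ per client and per IO cycle, so all
// per-cycle work must be sized off it rather than a stored buffer size.
static OSStatus MyDevice_DoIOOperation(AudioServerPlugInDriverRef inDriver,
                                       AudioObjectID inDeviceObjectID,
                                       AudioObjectID inStreamObjectID,
                                       UInt32 inClientID,
                                       UInt32 inOperationID,
                                       UInt32 inIOBufferFrameSize,
                                       const AudioServerPlugInIOCycleInfo* inIOCycleInfo,
                                       void* ioMainBuffer,
                                       void* ioSecondaryBuffer)
{
    if (inOperationID == kAudioServerPlugInIOOperationWriteMix) {
        // Assumed stream format: 2 channels of interleaved Float32.
        size_t byteCount = (size_t)inIOBufferFrameSize * 2 * sizeof(Float32);
        memset(ioMainBuffer, 0, byteCount);  // a null device just discards the mix
    }
    return kAudioHardwareNoError;
}
```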
I'm not completely sure, but as far as I can tell, you can't control kAudioDevicePropertyBufferFrameSizeRange from an AudioServerPlugin either.
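From the client side you can at least query that range before setting the frame size. A minimal sketch, again with a helper name of my own choosing:

```c
#include <CoreAudio/CoreAudio.h>

// Read the device's allowed IO buffer size range (in frames).
static OSStatus GetBufferFrameSizeRange(AudioObjectID deviceID,
                                        AudioValueRange *outRange)
{
    AudioObjectPropertyAddress address = {
        kAudioDevicePropertyBufferFrameSizeRange,
        kAudioObjectPropertyScopeGlobal,
        kAudioObjectPropertyElementMaster
    };
    UInt32 size = sizeof(*outRange);
    return AudioObjectGetPropertyData(deviceID, &address, 0, NULL,
                                      &size, outRange);
}

// outRange->mMinimum and outRange->mMaximum give the frame limits.
```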
Upvotes: 2