Curnelious

Reputation: 1

audio-unit sample rate and buffer size

I am facing a real misunderstanding when sampling the iPhone audio with RemoteIO.

On one side, I can do this math: a 44.1 kHz sample rate means roughly 44 samples per millisecond, which means that if I set the buffer duration to 0.005 s with:

    float bufferLength = 0.005; // 5 ms
    AudioSessionSetProperty(kAudioSessionProperty_PreferredHardwareIOBufferDuration, sizeof(bufferLength), &bufferLength);

I should get a 5 ms buffer, which means about 44 * 5 = 220 samples in the buffer on each callback. BUT I get 512 samples from inNumberFrames on each callback, and it stays fixed even when I change the buffer length.

Another thing: my callbacks come every 11 ms and that is not changing! I need faster callbacks.
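
Just to spell out the arithmetic I am relying on (a plain sketch using only my own numbers from above, nothing read back from the audio session):

    #include <stdio.h>

    int main(void) {
        double sampleRate = 44100.0;        // the hardware rate I am assuming
        double requestedDuration = 0.005;   // the 5 ms I ask for

        // what I expect: frames per callback = sample rate * buffer duration
        double expectedFrames = sampleRate * requestedDuration;   // ~220.5

        // what I actually get: 512 frames per callback implies this duration
        double impliedMs = 1000.0 * 512.0 / sampleRate;           // ~11.6 ms

        printf("expected frames: %.1f, duration implied by 512 frames: %.1f ms\n",
               expectedFrames, impliedMs);
        return 0;
    }

So the 512 frames I observe line up with the ~11 ms callback interval, not with the 5 ms I requested.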

So, what is going on here? Who sets what?

I need to pass digital information using FSK modulation, and I have to know the exact buffer size in samples, and how much time of the signal it holds, in order to know how to FFT it correctly.

Any explanation on this? Thanks a lot.

Upvotes: 1

Views: 7150

Answers (3)

hotpaw2

Reputation: 70673

There is no way on current iOS devices (as of iOS 10) to get RemoteIO audio recording buffer callbacks at a faster rate than every 5 to 6 milliseconds. The OS may even decide at runtime to switch to sending even larger buffers at a lower callback rate. The rate you request is merely a request; the OS then decides on the actual rates that are possible for the hardware, device driver, and device state. This rate may or may not stay fixed, so your app will just have to deal with different buffer sizes and rates.

One of your options might be to concatenate each callback buffer onto your own buffer, and chop up this second buffer however you like outside the audio callback. But this won't reduce actual latency.
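
For example, here is a minimal sketch of that idea (all of these names are placeholders, it assumes mono float samples, and it is not thread-safe as written; a real implementation would use a lock-free ring buffer):

    #include <string.h>

    #define kBlockSize     512                  // fixed analysis block size (placeholder)
    #define kFifoCapacity  (kBlockSize * 8)

    static float gFifo[kFifoCapacity];          // naive FIFO fed by the audio callback
    static int   gFifoCount = 0;                // samples currently held

    // Call from the RemoteIO input callback with whatever inNumberFrames arrived
    // (convert from 16-bit ints first if that is what your stream format delivers).
    void AppendSamples(const float *samples, int count) {
        if (gFifoCount + count > kFifoCapacity) return;   // drop on overflow (sketch only)
        memcpy(gFifo + gFifoCount, samples, count * sizeof(float));
        gFifoCount += count;
    }

    // Call outside the callback; copies one fixed-size block into outBlock.
    // Returns 1 if a full block was available, 0 otherwise.
    int PopBlock(float *outBlock) {
        if (gFifoCount < kBlockSize) return 0;
        memcpy(outBlock, gFifo, kBlockSize * sizeof(float));
        gFifoCount -= kBlockSize;
        memmove(gFifo, gFifo + kBlockSize, gFifoCount * sizeof(float));
        return 1;
    }

PopBlock then feeds your fixed-size FFT no matter what inNumberFrames each callback happened to deliver, but as noted above this does nothing to reduce the underlying latency.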

Added: some newer iOS devices allow returning audio unit buffers that are shorter than 5.x ms in duration, usually a power of 2 in size at a 48000 sample rate.

Upvotes: 5

jaybers

Reputation: 2211

The audio session property is a suggested value. You can put in a really tiny number, but it will just go to the lowest value it can. The fastest I have seen on an iOS device using 16-bit stereo was 0.002902 seconds (~3 ms).

That is 128 sample frames (L/R stereo pairs) per callback, and thus 128 * 2 channels * 2 bytes = 512 bytes per callback. So 128 / 44100 = 0.002902 seconds.

You can check it with:

AudioSessionGetProperty(kAudioSessionProperty_CurrentHardwareIOBufferDuration, &size, &bufferDuration);
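
Filled out a little more, here is a sketch using the same old C AudioSession API as in the question (it assumes the session has already been initialized and activated, and error handling is omitted), reading back both the actual buffer duration and the actual sample rate and deriving the frames per callback:

    #include <AudioToolbox/AudioToolbox.h>
    #include <math.h>
    #include <stdio.h>

    static void PrintActualIOConfig(void) {
        Float32 bufferDuration = 0;
        UInt32  durSize = sizeof(bufferDuration);
        AudioSessionGetProperty(kAudioSessionProperty_CurrentHardwareIOBufferDuration,
                                &durSize, &bufferDuration);

        Float64 sampleRate = 0;
        UInt32  rateSize = sizeof(sampleRate);
        AudioSessionGetProperty(kAudioSessionProperty_CurrentHardwareSampleRate,
                                &rateSize, &sampleRate);

        // Frames per callback that the hardware actually settled on.
        long framesPerCallback = lround(sampleRate * bufferDuration);
        printf("duration %f s, rate %f Hz, ~%ld frames per callback\n",
               bufferDuration, sampleRate, framesPerCallback);
    }

With 16-bit stereo, 128 frames per callback is exactly 512 bytes, which would explain the number in the original post.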

Could the value 512 in the original post have meant bytes instead of samples?

Upvotes: 1

justin

Reputation: 104698

I need to pass digital information using FSK modulation, and I have to know the exact buffer size in samples, and how much time of the signal it holds, in order to know how to FFT it correctly.

It doesn't work that way: you can't mandate that various hosts or hardware operate in exactly the manner that is optimal for your processing. You can request reduced latency, but only to a point. Audio systems generally pass streaming PCM data in blocks of samples whose size is a power of two, for efficient real-time I/O.

You would create your own buffer for your processor, and report latency (where applicable). You can attempt to reduce wall latency by choosing another sample rate, or by using a smaller N.
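
To make that trade-off concrete (the sample rates and block sizes below are only example values): one block covers N / sampleRate seconds, while the FFT bin spacing is sampleRate / N Hz, so shrinking N or changing the rate buys latency at the cost of frequency resolution.

    #include <stdio.h>

    int main(void) {
        // Example (sample rate, block size) pairs, illustrative values only.
        double rates[] = { 44100.0, 44100.0, 48000.0 };
        int    sizes[] = { 512, 256, 256 };

        for (int i = 0; i < 3; i++) {
            double blockMs  = 1000.0 * sizes[i] / rates[i];   // time one block covers
            double binWidth = rates[i] / sizes[i];            // FFT frequency resolution
            printf("N=%4d @ %5.0f Hz -> %5.2f ms per block, %6.1f Hz per bin\n",
                   sizes[i], rates[i], blockMs, binWidth);
        }
        return 0;
    }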

Upvotes: 3
