
Reputation: 103

AurioTouch purpose of AudioBufferList

In Apple's aurioTouch sample project, I've read that the AudioBufferList ioData in the performRender() function of the AudioController file carries the audio data from the mic to the audio player. Can anyone confirm this? Here is the code:

// Render callback function
static OSStatus performRender(void                        *inRefCon,
                              AudioUnitRenderActionFlags  *ioActionFlags,
                              const AudioTimeStamp        *inTimeStamp,
                              UInt32                       inBusNumber,
                              UInt32                       inNumberFrames,
                              AudioBufferList             *ioData)
{
    OSStatus err = noErr;
    if (*cd.audioChainIsBeingReconstructed == NO)
    {
        // we are calling AudioUnitRender on the input bus of AURemoteIO
        // this will store the audio data captured by the microphone in ioData
        err = AudioUnitRender(cd.rioUnit, ioActionFlags, inTimeStamp, 1, inNumberFrames, ioData);

        // filter out the DC component of the signal
        cd.dcRejectionFilter->ProcessInplace((Float32*)ioData->mBuffers[0].mData, inNumberFrames);

        // based on the current display mode, copy the required data to the buffer manager
        if (cd.bufferManager->GetDisplayMode() == aurioTouchDisplayModeOscilloscopeWaveform)
        {
            cd.bufferManager->CopyAudioDataToDrawBuffer((Float32*)ioData->mBuffers[0].mData, inNumberFrames);
        }
        else if ((cd.bufferManager->GetDisplayMode() == aurioTouchDisplayModeSpectrum) || (cd.bufferManager->GetDisplayMode() == aurioTouchDisplayModeOscilloscopeFFT))
        {
            if (cd.bufferManager->NeedsNewFFTData())
                cd.bufferManager->CopyAudioDataToFFTInputBuffer((Float32*)ioData->mBuffers[0].mData, inNumberFrames);
        }

        // mute audio if needed
        if (*cd.muteAudio)
        {
            for (UInt32 i = 0; i < ioData->mNumberBuffers; ++i)
                memset(ioData->mBuffers[i].mData, 0, ioData->mBuffers[i].mDataByteSize);
        }
    }

    return err;
}

Upvotes: 1

Views: 525

Answers (1)

jaybers

Reputation: 2211

The RemoteIO audio unit is a component that can access both the audio hardware's input (microphone) and its output (speaker).

The input gets data from the mic or a buffer. The output puts audio data to the speaker or a buffer.

Since the input gets data from the microphone, you can do whatever you want with it. You could save it to a file. You could send it up a network stream. You could keep it in memory.

On the output side, the audio output device (the speaker) requests data to be played.

So in the code you provided, all it does is connect the microphone to the speaker as a direct audio pass-through.

The function performRender() is called periodically by the audio system, and in effect it says "Gimme some audio samples to play." Inside that function, the line AudioUnitRender(cd.rioUnit, ioActionFlags, inTimeStamp, 1, inNumberFrames, ioData); fills ioData with the data produced by the mic.

You could replace the insides of performRender() and make your own audio data programmatically, read from a file or buffer, etc. Here, it simply reads the mic data.

As for your question about the purpose of AudioBufferList: it just provides a list of buffers, where each buffer typically holds one channel of audio (or, for interleaved formats, several channels in one buffer). Sometimes you have more than one channel depending on the format (mono, stereo, interleaved stereo, mixer channels, etc.) and the type of unit.

Upvotes: 1
