Reputation: 21
In RenderTone I fill the buffer with one tone at frequency and a second tone at roughly frequency/3.
When I run the code, the output sounds as if the buffer were not interleaved, which matches how it is set up in createToneUnit. The sound plays only in the left speaker. When both frequencies are written into the buffer, both tones play in the left speaker. When a frequency is not written into the buffer (e.g. leftON = 0), it is not played, so the buffer-writing code appears to be OK.
Since I suspect that I should not have kLinearPCMFormatFlagIsNonInterleaved set in createToneUnit, I tried to "clear" the flag. I read documentation for hours but never found a way to do this; experimenting only resulted in crashes at app launch.
How can I clear kLinearPCMFormatFlagIsNonInterleaved?
Or how can I avoid setting kLinearPCMFormatFlagIsNonInterleaved in the first place? (Commenting out the streamFormat.mFormatFlags assignment causes a crash as well.)
Possibly some other settings interfere with creating an interleaved playback.
OSStatus RenderTone(
    void                        *inRefCon,
    AudioUnitRenderActionFlags  *ioActionFlags,
    const AudioTimeStamp        *inTimeStamp,
    UInt32                      inBusNumber,
    UInt32                      inNumberFrames,
    AudioBufferList             *ioData)
{
    // Recover the view controller passed as the callback's refCon
    ToneGeneratorViewController *viewController =
        (ToneGeneratorViewController *)inRefCon;
    float sampleRate = viewController->sampleRate;
    float frequency  = viewController->frequency;
    // etc.
    float theta_increment = 2.0 * M_PI * frequency / sampleRate;
    float wave;
    float theta2;  // note: used below before being initialized;
                   // it should persist across callbacks, like theta
    float wave2;
    float theta_increment2 = 0.3 * theta_increment;
    const int channel = 0;
    Float32 *buffer = (Float32 *)ioData->mBuffers[channel].mData;
    // Note: an interleaved stereo buffer holds inNumberFrames * 2 Float32s,
    // so for interleaved data this loop should run to inNumberFrames * 2.
    for (UInt32 frame = 0; frame < inNumberFrames;)
    {
        theta  += theta_increment;
        wave    = sin(theta) * playVolume;
        theta2 += theta_increment2;
        wave2   = sin(theta2) * playVolume;
        buffer[frame++] = wave  * leftON;  // leftON = 1 or 0
        buffer[frame++] = wave2 * rightON; // rightON = 1 or 0
        if (theta > 2.0 * M_PI)
        {
            theta -= 2.0 * M_PI;
        }
    }
    // etc.
}
- (void)createToneUnit
{
    AudioComponentDescription defaultOutputDescription;
    defaultOutputDescription.componentType         = kAudioUnitType_Output;
    defaultOutputDescription.componentSubType      = kAudioUnitSubType_RemoteIO;
    defaultOutputDescription.componentManufacturer = kAudioUnitManufacturer_Apple;
    defaultOutputDescription.componentFlags        = 0;
    defaultOutputDescription.componentFlagsMask    = 0;
    // etc.
    err = AudioUnitSetProperty(toneUnit,
                               kAudioUnitProperty_SetRenderCallback,
                               kAudioUnitScope_Input,
                               0,
                               &input,
                               sizeof(input));
    const int four_bytes_per_float = 4;
    const int eight_bits_per_byte  = 8;
    AudioStreamBasicDescription streamFormat;
    streamFormat.mSampleRate       = sampleRate;
    streamFormat.mFormatID         = kAudioFormatLinearPCM;
    streamFormat.mFormatFlags      =
        kLinearPCMFormatFlagIsFloat | kLinearPCMFormatFlagIsNonInterleaved;
    streamFormat.mBytesPerPacket   = four_bytes_per_float;
    streamFormat.mFramesPerPacket  = 1;
    streamFormat.mBytesPerFrame    = four_bytes_per_float;
    streamFormat.mChannelsPerFrame = 2; // 2 = stereo
    streamFormat.mBitsPerChannel   = four_bytes_per_float * eight_bits_per_byte;
    err = AudioUnitSetProperty(toneUnit,
                               kAudioUnitProperty_StreamFormat,
                               kAudioUnitScope_Input,
                               0,
                               &streamFormat,
                               sizeof(AudioStreamBasicDescription));
}
Upvotes: 0
Views: 370
Reputation: 70703
In your case, you could simply not set the flag in the first place:
streamFormat.mFormatFlags = kLinearPCMFormatFlagIsFloat;
so that the non-interleaved flag is never bit-ORed into the flags value at all.
(Look up the C bitwise operators and how they work.)
Upvotes: 1