NJGUY

Is there a limit to the number of nodes AVAudioEngine can create?

In my code below, I create two sounds, sound1 and sound2. Each sound contains a number of samples that allow the same sound to be played simultaneously. The problem is that if I create more than roughly six to eight AVAudioPlayerNodes, each with an AVAudioUnitTimePitch attached, the audio gets completely messed up. If I increase the number of samples too high, I can't play even a single sound. I'm not sure whether my code is wrong or what the node limit of AVAudioEngine is.

import AVFoundation

class AudioManager{
    var audioEngine:AVAudioEngine!;
    var mixer:AVAudioMixerNode!;
    var sound1:Sound!;
    var sound2:Sound!;
    init(){
        audioEngine = AVAudioEngine();
        mixer = audioEngine.mainMixerNode; // accessing mainMixerNode implicitly creates the mixer and output nodes and connects them

        do{
            try audioEngine.start();
        }catch let e as NSError{
            print("Error Starting AudioEngine \(e)");
        }

        sound1 = Sound(aManager: self, path: "assets/sounds/waterRefill", ofType: "mp3", numOfSamples: 7);
        sound2 = Sound(aManager: self, path: "assets/sounds/balloonCreate", ofType: "mp3", numOfSamples: 2);


    }

    func playSound(){
        sound1.play(1.0, pitch: 1.0);
    }

    func playSound2(){
        sound2.play(1.0, pitch: 1.0);
    }

    class Sound {
        var audioManager:AudioManager!;
        var audioFileBuffer:AVAudioPCMBuffer!;
        var numSamples:Int = 1;
        var audioIndex:Int = 0;
        var sampleList:[Sample] = [Sample]();

        init(aManager:AudioManager, path:String, ofType:String, numOfSamples:Int){
            audioManager = aManager;
            numSamples = max(1, numOfSamples);
            audioFileBuffer = createAudioBuffer(path, ofType: ofType);
            for _ in 0..<numSamples {
                sampleList.append(Sample(sound: self));
            }
        }

        func createAudioBuffer(path:String, ofType:String)-> AVAudioPCMBuffer?{
            guard let filePath = NSBundle.mainBundle().pathForResource(path, ofType: ofType) else{
                print("Audio file not found: \(path).\(ofType)");
                return nil;
            }
            let fileURL: NSURL = NSURL(fileURLWithPath: filePath)
            do{
                let audioFile = try AVAudioFile(forReading: fileURL)
                let audioFormat = audioFile.processingFormat
                let audioFrameCount = UInt32(audioFile.length)
                let audioFileBuffer = AVAudioPCMBuffer(PCMFormat: audioFormat, frameCapacity: audioFrameCount)
                do{
                    try audioFile.readIntoBuffer(audioFileBuffer)
                    return audioFileBuffer;
                }catch let e as NSError{
                    print("Error loading Audio Into Buffer: \(e)");
                }
            }catch let e as NSError{
                print("Error loading Audio File: \(e)");
            }
            return nil;
        }

        private func runIndex(){
            // Advance to the next sample slot, wrapping back around to 0.
            audioIndex = (audioIndex + 1) % numSamples;
        }

        func play(volume:Float, pitch:Float){

            // Try at most numSamples players, triggering the first one that isn't busy.
            var count:Int = 0;
            while(count < numSamples){
                if(numSamples > 1){
                    runIndex();
                }
                if (!sampleList[audioIndex].pitchPlayer.playing) {
                    sampleList[audioIndex].volume = volume;
                    sampleList[audioIndex].pitch = pitch;
                    sampleList[audioIndex].playSample();
                    break;
                }
                count += 1;
            }

        }

        class Sample{
            var parentSound:Sound!
            var pitchPlayer:AVAudioPlayerNode!;
            var timePitch:AVAudioUnitTimePitch!;
            var volume:Float = 1.0
            var pitch:Float = 1.0

            init(sound:Sound){
                parentSound = sound;
                pitchPlayer = AVAudioPlayerNode();
                timePitch = AVAudioUnitTimePitch();

                parentSound.audioManager.audioEngine.attachNode(pitchPlayer);
                parentSound.audioManager.audioEngine.attachNode(timePitch);

                parentSound.audioManager.audioEngine.connect(pitchPlayer, to: timePitch, format: parentSound.audioFileBuffer.format);
                parentSound.audioManager.audioEngine.connect(timePitch, to: parentSound.audioManager.mixer, format: parentSound.audioFileBuffer.format);


            }

            func playSample(){
                pitchPlayer.volume = volume;
                timePitch.pitch = pitch;
                print("Sample Play");

                pitchPlayer.play();
                pitchPlayer.scheduleBuffer(parentSound.audioFileBuffer, atTime: nil, options:.Interrupts, completionHandler: {[unowned self]() in
                    print("Is Stopped: \(self.pitchPlayer.playing)");
                    self.pitchPlayer.stop();
                    print("Is Stopped: \(self.pitchPlayer.playing)");
                    });
            }
        }
    }
}

Answers (1)

Alex Machado

I have never heard of any hard limit on the number of nodes in an AVAudioEngine graph, but I have seen really bad performance after adding a few hundred nodes. The solution I found was to remove the nodes once they have finished playing.

The scheduleBuffer completion handler is a good place to do that, but I would wrap the removal in a dispatch_async to the main queue, since the audio engine might still be using the node when it calls the completion handler.
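For example, a minimal sketch assuming a player/time-pitch pair attached as in the question (playAndDetach and its parameters are illustrative names, not an existing API):

import AVFoundation

func playAndDetach(engine: AVAudioEngine, player: AVAudioPlayerNode, timePitch: AVAudioUnitTimePitch, buffer: AVAudioPCMBuffer) {
    player.scheduleBuffer(buffer, atTime: nil, options: [], completionHandler: {
        // The handler fires on an internal audio thread, and the engine may
        // still be using the node, so defer the teardown to the main queue.
        dispatch_async(dispatch_get_main_queue()) {
            player.stop()
            engine.detachNode(timePitch)
            engine.detachNode(player)
        }
    })
    player.play()
}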

Another option is to reuse player nodes after they have finished playing the sample, instead of creating a new one for every sample, but this approach can be a little more complex to implement.
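A rough sketch of that reuse idea, assuming a fixed-size pool connected straight to the main mixer (PlayerPool, size, and checkout are made-up names, not part of AVFoundation):

import AVFoundation

class PlayerPool {
    private let engine: AVAudioEngine
    private var players = [AVAudioPlayerNode]()

    init(engine: AVAudioEngine, size: Int) {
        self.engine = engine
        for _ in 0..<size {
            let player = AVAudioPlayerNode()
            engine.attachNode(player)
            engine.connect(player, to: engine.mainMixerNode, format: nil)
            players.append(player)
        }
    }

    // Hands back the first idle player, or nil if all of them are busy.
    func checkout() -> AVAudioPlayerNode? {
        return players.filter({ !$0.playing }).first
    }
}

A Sound would then ask the pool for an idle player and schedule its buffer on it, rather than attaching fresh nodes for every playback.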
