M.D

Reputation: 21

Details on using the AVAudioEngine

Background: I found one of Apple's WWDC sessions, "AVAudioEngine in Practice", and am trying to make something similar to the last demo shown at 43:35 (https://youtu.be/FlMaxen2eyw?t=2614). I'm using SpriteKit instead of SceneKit, but the principle is the same: I want to generate spheres, throw them around, and when they collide the engine plays a sound unique to each sphere.

Problems:

Code: As mentioned in the video, "...for every ball that's born into this world, a new player node is also created". I have a separate class for the spheres, with a method that returns an SKSpriteNode and also creates an AVAudioPlayerNode every time it is called:

class Sphere {

    var sphere: SKSpriteNode = SKSpriteNode(color: UIColor(), size: CGSize())
    var sphereScale: CGFloat = CGFloat(0.01)
    var spherePlayer = AVAudioPlayerNode()
    let audio = Audio()
    let sphereCollision: UInt32 = 0x1 << 0

    func createSphere(position: CGPoint, pitch: Float) -> SKSpriteNode {

        let texture = SKTexture(imageNamed: "Slice")
        let collisionTexture = SKTexture(imageNamed: "Collision")

        // Define the node

        sphere = SKSpriteNode(texture: texture, size: texture.size())

        sphere.position = position
        sphere.name = "sphere"
        sphere.physicsBody = SKPhysicsBody(texture: collisionTexture, size: sphere.size)
        sphere.physicsBody?.dynamic = true
        sphere.physicsBody?.mass = 0
        sphere.physicsBody?.restitution = 0.5
        sphere.physicsBody?.usesPreciseCollisionDetection = true
        sphere.physicsBody?.categoryBitMask = sphereCollision
        sphere.physicsBody?.contactTestBitMask = sphereCollision
        sphere.zPosition = 1

        // Create a dedicated AVAudioPlayerNode for this sphere

        spherePlayer = audio.createPlayer(pitch)

        return sphere
    }
}

Here's my Audio class, with which I create AVAudioPCMBuffers and AVAudioPlayerNodes:

class Audio {

    let engine: AVAudioEngine = AVAudioEngine()

    func createBuffer(name: String, type: String) -> AVAudioPCMBuffer {

        let audioFilePath = NSBundle.mainBundle().URLForResource(name, withExtension: type)!
        let audioFile = try! AVAudioFile(forReading: audioFilePath)
        let buffer = AVAudioPCMBuffer(PCMFormat: audioFile.processingFormat, frameCapacity: UInt32(audioFile.length))
        try! audioFile.readIntoBuffer(buffer)

        return buffer
    }

    func createPlayer(pitch: Float) -> AVAudioPlayerNode {

        let player = AVAudioPlayerNode()
        let buffer = self.createBuffer("PianoC1", type: "wav")
        let pitcher = AVAudioUnitTimePitch()
        let delay = AVAudioUnitDelay()
        pitcher.pitch = pitch
        delay.delayTime = 0.2
        delay.feedback = 90
        delay.wetDryMix = 0 // 0% wet: the delay is inaudible until this is raised

        engine.attachNode(pitcher)
        engine.attachNode(player)
        engine.attachNode(delay)

        // Chain: player -> time pitch -> delay -> main mixer
        engine.connect(player, to: pitcher, format: buffer.format)
        engine.connect(pitcher, to: delay, format: buffer.format)
        engine.connect(delay, to: engine.mainMixerNode, format: buffer.format)

        engine.prepare()
        try! engine.start()

        return player
    }
}
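
One thing to keep in mind with this design: createPlayer attaches three new nodes to the engine for every sphere and never detaches them, so a long-running game accumulates dead nodes. As a minimal cleanup sketch, assuming Audio keeps a reference to each player's effect chain (the chains dictionary and removePlayer are hypothetical additions, written in the same Swift 2 syntax as the rest of this post):

// Hypothetical bookkeeping inside Audio: remember each player's effect nodes
// so the whole chain can be torn down when its sphere is removed.
var chains = [AVAudioPlayerNode: [AVAudioNode]]()

func removePlayer(player: AVAudioPlayerNode) {
    player.stop()
    // Detaching a node also breaks its connections in the graph
    for node in chains[player] ?? [] {
        engine.detachNode(node)
    }
    engine.detachNode(player)
    chains[player] = nil
}

createPlayer would then record chains[player] = [pitcher, delay] before returning.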

In my GameScene class I then test for collisions, schedule a buffer, and play the AVAudioPlayerNode if contact has occurred:

func didBeginContact(contact: SKPhysicsContact) {

    let firstBody: SKPhysicsBody = contact.bodyA

    if firstBody.categoryBitMask & sphere.sphereCollision != 0 {

        // NOTE: this decodes the wav file into a fresh buffer on every contact
        let buffer1 = audio.createBuffer("PianoC1", type: "wav")
        sphere.spherePlayer.scheduleBuffer(buffer1, atTime: nil, options: AVAudioPlayerNodeBufferOptions.Interrupts, completionHandler: nil)
        sphere.spherePlayer.play()
    }
}
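
Since the same file is decoded on every contact, one easy improvement is to decode each file once and reuse the buffer. A minimal sketch, assuming a cachedBuffers dictionary and a buffer(_:type:) helper are added to the Audio class (both names are hypothetical), in the same Swift 2 syntax:

// Hypothetical cache inside Audio: decode each sound file once, reuse thereafter.
var cachedBuffers = [String: AVAudioPCMBuffer]()

func buffer(name: String, type: String) -> AVAudioPCMBuffer {
    let key = "\(name).\(type)"
    if let cached = cachedBuffers[key] {
        return cached
    }
    let fresh = createBuffer(name, type: type)
    cachedBuffers[key] = fresh
    return fresh
}

didBeginContact can then call audio.buffer("PianoC1", type: "wav") instead of audio.createBuffer("PianoC1", type: "wav").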

I'm new to Swift and only have basic knowledge of programming, so any suggestion/criticism is welcome.

Upvotes: 2

Views: 2053

Answers (1)

triple7

Reputation: 574

I've been working with AVAudioEngine in SceneKit, trying to do something else, but this sample is what you are looking for:

https://developer.apple.com/library/mac/samplecode/AVAEGamingExample/Listings/AVAEGamingExample_AudioEngine_m.html

It explains the process of:

1. Instantiating your own AVAudioEngine subclass
2. Methods to load PCMBuffers for each AVAudioPlayerNode
3. Changing your environment node's parameters to accommodate the reverb for the large number of pinball objects
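
Point 3 is not shown in the class further below, so here is a hedged sketch of what configuring the environment node's reverb can look like (the preset and level are illustrative values, in the same Swift 2 syntax as the answer's code):

let environment = AVAudioEnvironmentNode()

// Enable the environment's built-in reverb; each player then sets its own
// reverbBlend to control how much of its signal feeds that reverb.
environment.reverbParameters.enable = true
environment.reverbParameters.level = -10.0 // in dB; illustrative value
environment.reverbParameters.loadFactoryReverbPreset(AVAudioUnitReverbPreset.LargeHall)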

Edit: Converted, tested and added a few features:

1. You create a subclass of AVAudioEngine, named AudioLayerEngine for example. This is to access the AVAudioUnit effects such as distortion, delay, pitch and many of the other effects available as AudioUnits.
2. Initialise by setting up some configurations for the audio engine, such as the rendering algorithm, and expose the AVAudioEnvironmentNode to play with 3D positions of your SCNNode objects, or SKNode objects if you are in 2D but want 3D effects.
3. Create some helper methods to load presets for each AudioUnit effect you want.
4. Create a helper method to create an audio player and add it to whatever node you want, as many times as you want, since an SCNNode has an audioPlayers property which returns [SCNAudioPlayer].
5. Start playing.

I've pasted the entire class for reference so that you can structure it as you wish, but keep in mind that if you are coupling this with SceneKit or SpriteKit, you use this audioEngine to manage all your sounds instead of SceneKit's internal AVAudioEngine. This means that you instantiate it in your gameView during awakeFromNib (see the usage sketch after the class below).

import Foundation
import SceneKit
import AVFoundation

class AudioLayerEngine: AVAudioEngine {
    var engine: AVAudioEngine!
    var environment: AVAudioEnvironmentNode!
    var outputBuffer: AVAudioPCMBuffer!
    var voicePlayer: AVAudioPlayerNode!
    var multiChannelEnabled: Bool!
    // audio effects
    let delay = AVAudioUnitDelay()
    let distortion = AVAudioUnitDistortion()
    let reverb = AVAudioUnitReverb()

    override init() {
        super.init()
        engine = AVAudioEngine()
        environment = AVAudioEnvironmentNode()

        engine.attachNode(self.environment)
        voicePlayer = AVAudioPlayerNode()
        engine.attachNode(voicePlayer)
        voicePlayer.volume = 1.0
        outputBuffer = loadVoice()
        wireEngine()
        startEngine()
        voicePlayer.scheduleBuffer(self.outputBuffer, completionHandler: nil)
        voicePlayer.play()
    }

    func startEngine() {
        do {
            try engine.start()
        } catch {
            print("error starting engine: \(error)")
        }
    }

    func loadVoice() -> AVAudioPCMBuffer {
        let URL = NSURL(fileURLWithPath: NSBundle.mainBundle().pathForResource("art.scnassets/sounds/interface/test", ofType: "aiff")!)
        do {
            let soundFile = try AVAudioFile(forReading: URL, commonFormat: AVAudioCommonFormat.PCMFormatFloat32, interleaved: false)
            outputBuffer = AVAudioPCMBuffer(PCMFormat: soundFile.processingFormat, frameCapacity: AVAudioFrameCount(soundFile.length))
            do {
                try soundFile.readIntoBuffer(outputBuffer)
            } catch {
                print("something went wrong loading the sound file into the buffer: \(error)")
            }
            print("returning buffer")
            return outputBuffer
        } catch {
            print("something went wrong reading the sound file: \(error)")
        }
        return outputBuffer
    }

    func wireEngine() {
        loadDistortionPreset(AVAudioUnitDistortionPreset.MultiCellphoneConcert)
        engine.attachNode(distortion)
        engine.attachNode(delay)
        // Chain: voicePlayer -> distortion -> delay -> environment -> output
        engine.connect(voicePlayer, to: distortion, format: self.outputBuffer.format)
        engine.connect(distortion, to: delay, format: self.outputBuffer.format)
        engine.connect(delay, to: environment, format: self.outputBuffer.format)
        engine.connect(environment, to: engine.outputNode, format: constructOutputFormatForEnvironment())
    }

    func constructOutputFormatForEnvironment() -> AVAudioFormat {
        // Query the hardware format from bus 0 of the output node
        let outputChannelCount = self.engine.outputNode.outputFormatForBus(0).channelCount
        let hardwareSampleRate = self.engine.outputNode.outputFormatForBus(0).sampleRate
        let environmentOutputConnectionFormat = AVAudioFormat(standardFormatWithSampleRate: hardwareSampleRate, channels: outputChannelCount)
        multiChannelEnabled = false
        return environmentOutputConnectionFormat
    }

    func loadDistortionPreset(preset: AVAudioUnitDistortionPreset) {
        distortion.loadFactoryPreset(preset)
    }

    func createPlayer(node: SCNNode) {
        let player = AVAudioPlayerNode()
        distortion.loadFactoryPreset(AVAudioUnitDistortionPreset.SpeechCosmicInterference)
        engine.attachNode(player)
        // NOTE: this reuses the shared distortion node, which rewires the
        // voicePlayer chain set up in wireEngine(); give each player its own
        // effect nodes if both chains should keep playing.
        engine.attachNode(distortion)
        engine.connect(player, to: distortion, format: outputBuffer.format)
        engine.connect(distortion, to: environment, format: constructOutputFormatForEnvironment())
        // HRTF gives convincing 3D spatialisation for mono sources
        player.renderingAlgorithm = AVAudio3DMixingRenderingAlgorithm.HRTF
        player.reverbBlend = 0.3
    }
}
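
As a hedged usage sketch of that instantiation point (GameView and the audioEngine property are illustrative names, not part of the class above), setting it up during awakeFromNib might look like:

import Cocoa
import SceneKit

class GameView: SCNView {
    // One engine owned by the game view, managing all game sounds
    var audioEngine: AudioLayerEngine!

    override func awakeFromNib() {
        super.awakeFromNib()
        // Use this engine instead of SceneKit's internal AVAudioEngine
        audioEngine = AudioLayerEngine()
    }
}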

Upvotes: 2
