babybluesedan

Reputation: 33

AVFoundation positional audio not working on iOS device

I'm trying to play positional audio in a Swift iOS app using AVAudioEngine and AVAudioEnvironmentNode. In the simulator this works: the audio plays and is spatialized, panning between the two stereo outputs as the source moves. When I run the same app on an iPhone, though, the audio plays at equal volume in both ears rather than panning when the source is moved around. Is there some special configuration I need to do, like manually handling the device audio output?

I initialize the audio engine and player like so:

import AVFoundation

let audioEngine = AVAudioEngine()
let audioEnv = AVAudioEnvironmentNode()

// Attach the environment node and connect it to the main mixer.
audioEngine.attach(audioEnv)
audioEngine.connect(
    audioEnv,
    to: audioEngine.mainMixerNode,
    format: audioEnv.outputFormat(forBus: 0)
)
try audioEngine.start()

// Attach the player and connect it to the environment node
// with a mono format (the environment node only spatializes mono inputs).
let player = AVAudioPlayerNode()
audioEngine.attach(player)
audioEngine.connect(
    player,
    to: audioEnv,
    format: AVAudioFormat(standardFormatWithSampleRate: 44100, channels: 1)
)

player.scheduleFile(...)
player.play()

My source files are mono-channel .wav files.
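
The scheduleFile(...) call above is the stock scheduleFile(_:at:completionHandler:); roughly this, with a placeholder file name:

// Load a mono file from the bundle and schedule it on the player.
// "source.wav" stands in for the real file name.
let url = Bundle.main.url(forResource: "source", withExtension: "wav")!
let file = try AVAudioFile(forReading: url)
player.scheduleFile(file, at: nil, completionHandler: nil)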

Later, I change the position of the player:

player.position = AVAudio3DPoint(x: 5, y: 0, z: 0)

This should play only (or mostly) in the right ear. When run in the iOS simulator, it does exactly what I expect. On an actual device, however, it plays evenly in both ears no matter what player.position is set to. I suspect it has something to do with the configuration of audioEngine.
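
As I understand it, the environment node's listener defaults to the origin, facing down the negative z-axis, which puts a source at x: 5 to the listener's right. Spelling the defaults out (same audioEnv and player as above):

// Listener defaults, written out explicitly; with this orientation,
// positive x is to the listener's right.
audioEnv.listenerPosition = AVAudio3DPoint(x: 0, y: 0, z: 0)
audioEnv.listenerAngularOrientation = AVAudio3DAngularOrientation(yaw: 0, pitch: 0, roll: 0)

player.position = AVAudio3DPoint(x: 5, y: 0, z: 0)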

Thoughts?

Upvotes: 1

Views: 650

Answers (1)

EPage_Ed

Reputation: 1183

Try setting the rendering algorithm on the player node that feeds the environment node:

player.renderingAlgorithm = .HRTFHQ // or .HRTF
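
renderingAlgorithm is an AVAudioMixing property, so it takes effect on the source node connected to the environment node rather than on the environment node itself, and the documented default is .equalPowerPanning. A rough sketch of where it fits in your setup (same names as your code):

// Set the algorithm on the player before connecting it to the
// environment node. HRTF filtering is what produces the pronounced
// per-ear spatialization through headphones.
player.renderingAlgorithm = .HRTFHQ // or .HRTF

audioEngine.connect(
    player,
    to: audioEnv,
    format: AVAudioFormat(standardFormatWithSampleRate: 44100, channels: 1)
)

// If in doubt, list the algorithms the environment node accepts
// (returned as NSNumber-wrapped raw values):
print(audioEnv.applicableRenderingAlgorithms)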

Upvotes: 1
