Logan

Reputation: 1127

Is there any relationship between an AVAudioEngine and an AVAudioSession?

I understand that this question might get a bad rating, but I've been looking at questions that ask how to reroute audio output to the loud speaker on iOS devices.

In every question I looked at, the user talked about using an AVAudioSession to reroute it. However, I'm not using an AVAudioSession; I'm using an AVAudioEngine.

So basically my question is, even though I'm using an AVAudioEngine, should I still have an AVAudioSession?

If so, what is the relationship between these two objects? Or is there a way to connect an AVAudioEngine to an AVAudioSession?


If this is not the case, and there is no relation between an AVAudioEngine and an AVAudioSession, then how do you reroute audio so that it plays out of the main speakers on an iOS device rather than the earpiece?

Thank you!

Upvotes: 13

Views: 2777

Answers (2)

Sunjoong Kevin Kim

Reputation: 106

Yes. It is not clearly documented; however, I found this note in the iOS developer documentation:

AVFoundation playback and recording classes automatically activate your audio session.

Document Link : https://developer.apple.com/library/content/documentation/Audio/Conceptual/AudioSessionProgrammingGuide/ConfiguringanAudioSession/ConfiguringanAudioSession.html
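In other words, the engine plays through whatever route the shared session provides, so you can still configure the session yourself before starting the engine. A minimal sketch in modern Swift, assuming you want the play-and-record category to favor the loud speaker:

```swift
import AVFoundation

// The shared session is configured separately from the engine;
// AVAudioEngine renders through whatever route the session provides.
let session = AVAudioSession.sharedInstance()
do {
    // .defaultToSpeaker routes play-and-record output to the loud
    // speaker instead of the earpiece receiver.
    try session.setCategory(.playAndRecord, options: [.defaultToSpeaker])
    try session.setActive(true)
} catch {
    print("Audio session configuration failed: \(error)")
}
```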

I hope this may help you.

Upvotes: 2

Return Zero

Reputation: 434

AVAudioSession is specific to iOS and coordinates audio playback between apps, so that, for example, audio is stopped when a call comes in, or music playback stops when the user starts a movie. This API is needed to make sure an app behaves correctly in response to such events.

AVAudioEngine is a modern Objective-C API for playback and recording. It provides a level of control for which you previously had to drop down to the C APIs of the Audio Toolbox framework (for example, with real-time audio tasks). The audio engine APIs are built to interface well with lower-level APIs, so you can still drop down to Audio Toolbox if you have to.

The basic concept of this API is to build up a graph of audio nodes, ranging from source nodes (players and microphones) through processing nodes (mixers and effects) to destination nodes (hardware outputs). Each node has a certain number of input and output busses with well-defined data formats. This architecture makes it very flexible and powerful. And it even integrates with audio units.
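For illustration, a minimal sketch of such a graph: attach a player node and connect it to the engine's built-in main mixer (the "beep.caf" resource name is just a placeholder for this sketch):

```swift
import AVFoundation

let engine = AVAudioEngine()
let player = AVAudioPlayerNode()

// Source node: the player. The engine already owns a main mixer node
// and an output node wired to the hardware.
engine.attach(player)

do {
    // "beep" is a placeholder resource name for this sketch.
    let url = Bundle.main.url(forResource: "beep", withExtension: "caf")!
    let file = try AVAudioFile(forReading: url)

    // Connect source -> mixer; the mixer is connected to the output by default.
    engine.connect(player, to: engine.mainMixerNode, format: file.processingFormat)

    try engine.start()
    player.scheduleFile(file, at: nil, completionHandler: nil)
    player.play()
} catch {
    print("Engine setup failed: \(error)")
}
```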

So there is no inclusive relation between the two; they address different concerns, and you use them side by side.

Source Link : https://www.objc.io/issues/24-audio/audio-api-overview/
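That said, the routing part of the question is still answered by the shared session: even when every buffer goes through an AVAudioEngine, the output route belongs to AVAudioSession. A minimal sketch using the override API (the temporary counterpart to the .defaultToSpeaker option shown in the other answer):

```swift
import AVFoundation

let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(.playAndRecord)
    // Force output to the built-in loud speaker instead of the receiver.
    // Unlike .defaultToSpeaker, this override lasts only until the route changes.
    try session.overrideOutputAudioPort(.speaker)
    try session.setActive(true)
} catch {
    print("Failed to override the output route: \(error)")
}
```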

Upvotes: 9
