Legoless

Reputation: 11112

Digital Audio Workstation Architecture on iOS

I am developing an architecture for a digital audio workstation that works on iOS (mainly, but I am trying to support OS X too). I am slowly going through miles of Apple's documentation and their framework references.

I have experience with DSP, but iOS is newer to me, and there are so many objects, tutorials (even for older versions of iOS), and different frameworks with different APIs. I would just like to make sure I choose the right one, or the right combination of them, from the start.

The goals of the architecture are:

I hope I did not miss anything, but those are the most important goals.

My research

I have looked through most of the frameworks (though not in much detail) and here is what I have figured out. Apple lists the following frameworks for using audio on iOS:

- Media Player framework
- AV Foundation framework
- OpenAL framework
- Audio Toolbox framework
- Audio Unit framework

Media Player and AV Foundation are APIs that are too high-level and do not allow direct sample access. OpenAL, on the other hand, cannot record audio. That leaves the Audio Toolbox and Audio Unit frameworks. Many of the differences are explained here: What's the difference between all these audio frameworks?

As far as I understand, Audio Toolbox would be the way to go, since MIDI is currently not required. But there are few tutorials and little information on using Audio Toolbox for more professional control, such as recording and playback. There is much more material on Audio Units, though.
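
From the scattered examples I have found, recording through Audio Queue Services seems to look roughly like the sketch below. This is only my own reconstruction, so error handling is omitted and the format and buffer sizes are guesses on my part:

```c
// Rough sketch of recording with Audio Queue Services, pieced together
// from scattered examples. Error handling omitted.
#include <AudioToolbox/AudioToolbox.h>

static void InputCallback(void *userData, AudioQueueRef queue,
                          AudioQueueBufferRef buffer,
                          const AudioTimeStamp *startTime,
                          UInt32 numPackets,
                          const AudioStreamPacketDescription *packetDescs) {
    // buffer->mAudioData now holds the recorded samples; process or
    // write them out, then hand the buffer back to the queue.
    AudioQueueEnqueueBuffer(queue, buffer, 0, NULL);
}

void StartRecording(void) {
    AudioStreamBasicDescription format = {
        .mSampleRate       = 44100.0,
        .mFormatID         = kAudioFormatLinearPCM,
        .mFormatFlags      = kAudioFormatFlagIsSignedInteger
                           | kAudioFormatFlagIsPacked,
        .mChannelsPerFrame = 1,
        .mBitsPerChannel   = 16,
        .mBytesPerFrame    = 2,
        .mFramesPerPacket  = 1,
        .mBytesPerPacket   = 2
    };
    AudioQueueRef queue;
    AudioQueueNewInput(&format, InputCallback, NULL, NULL, NULL, 0, &queue);

    for (int i = 0; i < 3; i++) {          // a few buffers keep the queue fed
        AudioQueueBufferRef buffer;
        AudioQueueAllocateBuffer(queue, 4096, &buffer);
        AudioQueueEnqueueBuffer(queue, buffer, 0, NULL);
    }
    AudioQueueStart(queue, NULL);
}
```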

My first question: what exactly are Audio Queue Services, and which framework do they belong to?

And then the final question:

Which framework should I use to achieve most of the desired goals?

You can even suggest a mix of frameworks and classes, but I kindly ask you to explain your answer and describe in more detail which classes you would use to achieve each goal. I prefer the highest-level API possible, but as low-level as needed to achieve the goals. Links to sample code are also welcome.

Thank you very much for your help.

Upvotes: 2

Views: 703

Answers (1)

hotpaw2

Reputation: 70693

Audio Units is the lowest-level iOS audio API, and the API that Audio Queues are built upon. Audio Units will give an app the lowest latency, and thus the closest to real-time processing possible. It is a C API, though, so an app may have to do some of its own audio memory management.
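
For illustration only, a bare-bones RemoteIO setup might look something like the sketch below. Error checks, stream-format configuration, and audio session setup are all omitted, and the function names are placeholders:

```c
// Sketch: attach a render callback to a RemoteIO Audio Unit (iOS).
// On OS X the output subtype would be kAudioUnitSubType_DefaultOutput.
#include <AudioUnit/AudioUnit.h>
#include <string.h>

static OSStatus RenderCallback(void *refCon,
                               AudioUnitRenderActionFlags *flags,
                               const AudioTimeStamp *timeStamp,
                               UInt32 busNumber,
                               UInt32 numFrames,
                               AudioBufferList *ioData) {
    // Runs on the real-time audio thread: no locks, no allocation.
    for (UInt32 i = 0; i < ioData->mNumberBuffers; i++)
        memset(ioData->mBuffers[i].mData, 0,      // silence; DSP goes here
               ioData->mBuffers[i].mDataByteSize);
    return noErr;
}

void StartIOUnit(void) {
    AudioComponentDescription desc = {
        .componentType         = kAudioUnitType_Output,
        .componentSubType      = kAudioUnitSubType_RemoteIO,
        .componentManufacturer = kAudioUnitManufacturer_Apple
    };
    AudioComponent comp = AudioComponentFindNext(NULL, &desc);
    AudioUnit ioUnit;
    AudioComponentInstanceNew(comp, &ioUnit);

    AURenderCallbackStruct cb = { RenderCallback, NULL };
    AudioUnitSetProperty(ioUnit, kAudioUnitProperty_SetRenderCallback,
                         kAudioUnitScope_Input, 0, &cb, sizeof(cb));
    AudioUnitInitialize(ioUnit);
    AudioOutputUnitStart(ioUnit);
}
```

The render callback runs on a high-priority real-time thread, which is why the usual advice is to avoid locks, Objective-C messaging, and memory allocation inside it.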

The AVFoundation framework may provide an app with easier access to music library assets.

An app can only process sound from other apps that explicitly publish their audio data. That does not include the built-in Music player app, but it does include some of the apps using Apple's Inter-App Audio API and the 3rd-party Audiobus API.

Upvotes: 4
