Paul Masri-Stone

Reputation: 3139

Can code using Core Audio be compatible across iOS & macOS?

Core Audio is the audio framework for both iOS and macOS. From Apple's own documentation, I can see there are differences in the implementation. The most notable appear to be the sample format (fixed point on iOS vs. 32-bit floating point on macOS) and the fact that iOS does not support every aspect of Core Audio.

When writing code to achieve something that both platforms do support, is it possible to write the code once and truly port it over? If the answer is "Yes, but only in certain aspects", please explain more.

Let's take live audio synthesis as an example. I want to open an audio output stream and use the callback method to place samples in the output buffer. I know this can be done on both iOS and macOS, but when I look for libraries, they don't seem to support both. Could these libraries actually support both platforms, or is there a fundamental reason that prevents this?

Upvotes: 2

Views: 1038

Answers (1)

dave234

Reputation: 4955

The canonical sample format is now stereo float 32 on iOS too.
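
As a rough sketch (not tied to any particular library), the same AudioStreamBasicDescription for interleaved stereo 32-bit float PCM can be used unchanged on both platforms:

#include <AudioToolbox/AudioToolbox.h>

// Sketch: interleaved stereo, native-endian 32-bit float PCM.
// The same description works on both iOS and macOS.
AudioStreamBasicDescription float32StereoDescription(Float64 sampleRate) {
    AudioStreamBasicDescription asbd = {0};
    asbd.mSampleRate       = sampleRate;
    asbd.mFormatID         = kAudioFormatLinearPCM;
    asbd.mFormatFlags      = kAudioFormatFlagsNativeFloatPacked;
    asbd.mChannelsPerFrame = 2;
    asbd.mBitsPerChannel   = 32;
    asbd.mFramesPerPacket  = 1;
    asbd.mBytesPerFrame    = asbd.mChannelsPerFrame * sizeof(Float32);
    asbd.mBytesPerPacket   = asbd.mBytesPerFrame * asbd.mFramesPerPacket;
    return asbd;
}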

macOS supports custom v2 and v3 audio units, while iOS supports custom v3 audio units but only system-provided v2 audio units.

AVAudioEngine and friends wrap much of the Core Audio C API in Swift/ObjC, and I believe there are very few platform differences, if any. I recommend trying AVAudioEngine first, then dropping down to the C API if it doesn't meet your needs.

Much of the C API is cross-platform, but there are some areas where something is supported on macOS only, or iOS only. You can look through the headers to see the differences. For example, here are the definitions of the output audio unit sub-types (with documentation removed).

#if !TARGET_OS_IPHONE

CF_ENUM(UInt32) {
    kAudioUnitSubType_HALOutput             = 'ahal',
    kAudioUnitSubType_DefaultOutput         = 'def ',
    kAudioUnitSubType_SystemOutput          = 'sys ',
};

#else

CF_ENUM(UInt32) {
    kAudioUnitSubType_RemoteIO              = 'rioc',
};

#endif

If you want to write a cross-platform wrapper, you have to use preprocessor directives around the platform-specific parts. Here is a cross-platform function that creates an AudioComponentDescription for an output audio unit using the platform-specific sub-types.

AudioComponentDescription outputDescription() {

    AudioComponentDescription description;
    description.componentType = kAudioUnitType_Output;
    description.componentManufacturer = kAudioUnitManufacturer_Apple;
    description.componentFlags = 0;
    description.componentFlagsMask = 0;

#if TARGET_OS_IPHONE
    description.componentSubType = kAudioUnitSubType_RemoteIO;
#else
    description.componentSubType = kAudioUnitSubType_DefaultOutput;
#endif

    return description;
}
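
To show how it fits together, here is a sketch of using that description to create and start the output unit with a render callback on either platform (error handling omitted; MyRenderCallback is just a placeholder name for your own synthesis callback):

#include <AudioToolbox/AudioToolbox.h>
#include <string.h>

// Placeholder render callback: write inNumberFrames frames of audio into
// ioData here. This sketch just outputs silence.
static OSStatus MyRenderCallback(void *inRefCon,
                                 AudioUnitRenderActionFlags *ioActionFlags,
                                 const AudioTimeStamp *inTimeStamp,
                                 UInt32 inBusNumber,
                                 UInt32 inNumberFrames,
                                 AudioBufferList *ioData) {
    for (UInt32 i = 0; i < ioData->mNumberBuffers; i++) {
        memset(ioData->mBuffers[i].mData, 0, ioData->mBuffers[i].mDataByteSize);
    }
    return noErr;
}

void startOutputUnit(void) {
    AudioComponentDescription description = outputDescription();
    AudioComponent component = AudioComponentFindNext(NULL, &description);

    AudioUnit outputUnit;
    AudioComponentInstanceNew(component, &outputUnit);

    // The render callback feeds samples to the output unit's input bus.
    AURenderCallbackStruct callback = { MyRenderCallback, NULL };
    AudioUnitSetProperty(outputUnit,
                         kAudioUnitProperty_SetRenderCallback,
                         kAudioUnitScope_Input,
                         0,
                         &callback,
                         sizeof(callback));

    AudioUnitInitialize(outputUnit);
    AudioOutputUnitStart(outputUnit);
}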

There are some other audio units that are only supported on iOS or macOS, and the API that manages "system"-level audio interaction is completely different: macOS uses a C API (the Core Audio HAL), while iOS has AVAudioSession.
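
For example (just a sketch), getting the default output device on macOS goes through the HAL's C API; on iOS there is no equivalent call, and you would go through [AVAudioSession sharedInstance] instead.

#include <TargetConditionals.h>

#if !TARGET_OS_IPHONE
#include <CoreAudio/CoreAudio.h>

// macOS only: ask the HAL for the default output device.
// On iOS, routing and audio-session configuration go through AVAudioSession.
AudioDeviceID defaultOutputDevice(void) {
    AudioDeviceID deviceID = kAudioObjectUnknown;
    UInt32 size = sizeof(deviceID);
    AudioObjectPropertyAddress address = {
        kAudioHardwarePropertyDefaultOutputDevice,
        kAudioObjectPropertyScopeGlobal,
        kAudioObjectPropertyElementMaster
    };
    AudioObjectGetPropertyData(kAudioObjectSystemObject, &address,
                               0, NULL, &size, &deviceID);
    return deviceID;
}
#endif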

I'm sure I'm missing some things :)

Upvotes: 2
