lucianoenrico

Reputation: 1494

List the available audio output routes with AVAudioSession

I need to list the audio outputs available to an iOS application. My question is similar to this one: How to list available audio output route on iOS

I tried this code:

NSError *setCategoryError = nil;
BOOL success = [[AVAudioSession sharedInstance] setCategory: AVAudioSessionCategoryPlayback
                                                      error: &setCategoryError];

NSError *activationError = nil;
[[AVAudioSession sharedInstance] setActive: YES error: &activationError];

…
NSLog(@"session.currentRoute.outputs count %d", [[[[AVAudioSession sharedInstance] currentRoute] outputs ] count]);
for (AVAudioSessionPortDescription *portDesc in [[[AVAudioSession sharedInstance] currentRoute] outputs ]) {
    NSLog(@"-----");
    NSLog(@"portDesc UID %@", portDesc.UID);
    NSLog(@"portDesc portName %@", portDesc.portName);
    NSLog(@"portDesc portType %@", portDesc.portType);
    NSLog(@"portDesc channels %@", portDesc.channels);
}

However, I always see just one output port (the count is 1), even though I have two available (AirPlay and the built-in speaker). In the Music application I can see both ports and switch between them; in my app I only see the one that is currently selected.

Is there something else I need to do?

Thank you

EDIT:

I tried this code, too:

CFDictionaryRef asCFType = nil;
UInt32 dataSize = sizeof(asCFType);
AudioSessionGetProperty(kAudioSessionProperty_AudioRouteDescription, &dataSize, &asCFType);
NSDictionary *audioRoutesDesc = (__bridge NSDictionary *)asCFType;
NSLog(@"audioRoutesDesc %@", audioRoutesDesc);

but the dictionary lists just one output destination. Moreover, the input sources array is empty (I have an iPhone 4s).

EDIT2:

I got something working using MPVolumeView. This component has a button that lets you choose the audio output route, like in the Music app.

If you want, you can hide the slider (and keep only the route button) using:

self.myMPVolumeView.showsVolumeSlider = NO;
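
For reference, in Swift the same setup looks roughly like the sketch below (the frame and the parent view are just placeholders, not part of my actual code):

import MediaPlayer

// Minimal sketch: an MPVolumeView used only for its route button.
let volumeView = MPVolumeView(frame: CGRect(x: 0, y: 0, width: 44, height: 44))
volumeView.showsVolumeSlider = false   // hide the volume slider
volumeView.showsRouteButton = true     // keep the AirPlay / output-route button
view.addSubview(volumeView)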

Upvotes: 29

Views: 19438

Answers (4)

Suman

Reputation: 31

Please check this out for the complete working code: it covers the audio session, handling the output device, and showing the available devices in an action sheet.

The following files are available at the given link. Below is a brief description of each file.

AVAudioSessionHandler.swift ->

Contains the methods used to override the audio route according to the selected output device.

AudioOutputDeviceHandler.swift ->

Contains the methods used to get the list of input devices and the current output device, and to show an action sheet with all the available devices.

SpeakerUIHandler.swift ->

Contains the methods used to update the speaker UI according to the selected output device.

AudioSession.swift ->

Contains the methods used to create the audio session and set all the required session parameters (a rough sketch of such a setup is shown below).
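
The linked AudioSession.swift is not reproduced here, so the following is only a rough, hypothetical sketch of what such a setup could look like; the category, mode, and options are assumptions and may differ from the code at the link.

import AVFoundation

// Hypothetical sketch of an AudioSession-style helper (not the linked file).
final class AudioSession {

    static let shared = AudioSession()

    func configure() {
        let session = AVAudioSession.sharedInstance()
        do {
            // Play-and-record with Bluetooth and speaker options, so that headsets,
            // car audio, and the built-in speaker can all appear as routes.
            try session.setCategory(.playAndRecord,
                                    mode: .default,
                                    options: [.allowBluetooth, .allowBluetoothA2DP, .defaultToSpeaker])
            try session.setActive(true)
        } catch {
            print("Failed to configure the audio session: \(error)")
        }
    }
}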

Please check the code below for listing the available input devices.

extension AVAudioSession {
    
    @objc func ChangeAudioOutput(_ presenterViewController : UIViewController, _ speakerButton: UIButton) {
        
        let CHECKED_KEY = "checked"
        var deviceAction = UIAlertAction()
        var headphonesExist = false
        
        if AudioOutputDeviceHandler.sharedInstance.isDeviceListRequired() {
            
            let optionMenu = UIAlertController(title: nil, message: nil, preferredStyle: .actionSheet)
            
            for audioPort in self.availableInputs!{
                
                switch audioPort.portType {
                    
                case AVAudioSession.Port.bluetoothA2DP, AVAudioSession.Port.bluetoothHFP, AVAudioSession.Port.bluetoothLE :
                    
                    overrideBluetooth(audioPort, optionMenu)
                    break
                    
                case AVAudioSession.Port.builtInMic, AVAudioSession.Port.builtInReceiver:
                    
                    deviceAction = overrideBuiltInReceiver(audioPort)
                    break
                    
                case AVAudioSession.Port.headphones, AVAudioSession.Port.headsetMic:
                    
                    headphonesExist = true
                    overrideheadphones(audioPort,optionMenu)
                    break
                    
                case AVAudioSession.Port.carAudio:
                    overrideCarAudio(port: audioPort, optionMenu: optionMenu)
                    break
                    
                default:
                    break
                }
            }
            
            if !headphonesExist {
                
                if self.currentRoute.outputs.contains(where: {return $0.portType == AVAudioSession.Port.builtInReceiver}) || self.currentRoute.outputs.contains(where: {return $0.portType == AVAudioSession.Port.builtInMic}) {
                    deviceAction.setValue(true, forKey: CHECKED_KEY)
                }
                optionMenu.addAction(deviceAction)
            }
            
            overrideSpeaker(optionMenu)
            
            let cancelAction = UIAlertAction(title: "Cancel", style: .cancel, handler: {
                (alert: UIAlertAction!) -> Void in
                
            })
            
            optionMenu.addAction(cancelAction)
            
            alertViewSetupForIpad(optionMenu, speakerButton)
            presenterViewController.present(optionMenu, animated: false, completion: nil)
            
            // auto dismiss after 5 seconds
            DispatchQueue.main.asyncAfter(deadline: .now() + 5.0) {
                optionMenu.dismiss(animated: true, completion: nil)
            }
            
        } else {
            if self.isBuiltInSpeaker {
                
                if AudioOutputDeviceHandler.sharedInstance.isSpeaker {
                    let port = self.currentRoute.inputs.first!
                    setPortToNone(port)
                    AudioOutputDeviceHandler.sharedInstance.isSpeaker = false
                }
            }
            else if self.isReceiver || self.isBuiltInMic  || self.isHeadphonesConnected {
                
                setPortToSpeaker()
                AudioOutputDeviceHandler.sharedInstance.isSpeaker = true
            }
        }
    }
    
    func overrideCarAudio(port: AVAudioSessionPortDescription, optionMenu: UIAlertController) {
        
        let action = UIAlertAction(title: port.portName, style: .default) { (action) in
            do {
                // set new input
                try self.setPreferredInput(port)
            } catch let error as NSError {
                print("audioSession error change to input: \(port.portName) with error: \(error.localizedDescription)")
            }
        }
        
        if self.currentRoute.outputs.contains(where: {return $0.portType == port.portType}){
            action.setValue(true, forKey: "checked")
        }
        
        if let image = UIImage(named: "CarAudio") {
            action.setValue(image, forKey: "image")
        }
        optionMenu.addAction(action)
    }
    
    func overrideheadphones(_ port: AVAudioSessionPortDescription, _ optionMenu: UIAlertController) {
        
        let CHECKED_KEY = "checked"
        let HEADPHONES_TITLE = "Headphones"
        let action = UIAlertAction(title: HEADPHONES_TITLE, style: .default) { (action) in
            do {
                // set new input
                try self.setPreferredInput(port)
            } catch let error as NSError {
                print("audioSession error change to input: \(port.portName) with error: \(error.localizedDescription)")
            }
        }
        
        if self.currentRoute.outputs.contains(where: {return $0.portType == AVAudioSession.Port.headphones}) || self.currentRoute.outputs.contains(where: {return $0.portType == AVAudioSession.Port.headsetMic}) {
            action.setValue(true, forKey: CHECKED_KEY)
        }
        
        if let image = UIImage(named: "Headphone") {
            action.setValue(image, forKey: "image")
        }
        
        optionMenu.addAction(action)
    }
    
    func overrideSpeaker(_ optionMenu: UIAlertController) {
        
        let SPEAKER_TITLE = "Speaker"
        let CHECKED_KEY = "checked"
        let speakerOutput = UIAlertAction(title: SPEAKER_TITLE, style: .default, handler: {
            [weak self] (alert: UIAlertAction!) -> Void in
            self?.setPortToSpeaker()
        })
        AudioOutputDeviceHandler.sharedInstance.isSpeaker = true
        
        if self.currentRoute.outputs.contains(where: {return $0.portType == AVAudioSession.Port.builtInSpeaker}){
            
            speakerOutput.setValue(true, forKey: CHECKED_KEY)
        }
        
        if let image = UIImage(named: "Speaker") {
            speakerOutput.setValue(image, forKey: "image")
        }
        optionMenu.addAction(speakerOutput)
    }
    
    func overrideBluetooth(_ port: AVAudioSessionPortDescription, _ optionMenu: UIAlertController) {
        
        let CHECKED_KEY = "checked"
        let action = UIAlertAction(title: port.portName, style: .default) { (action) in
            do {
                // set new input
                try self.setPreferredInput(port)
            } catch let error as NSError {
                print("audioSession error change to input: \(port.portName) with error: \(error.localizedDescription)")
            }
        }
        
        if self.currentRoute.outputs.contains(where: {return $0.portType == port.portType}){
            action.setValue(true, forKey: CHECKED_KEY)
        }
        if let image = UIImage(named: "Bluetooth") {
            action.setValue(image, forKey: "image")
        }
        optionMenu.addAction(action)
    }
    
    func overrideBuiltInReceiver(_ port: AVAudioSessionPortDescription) -> UIAlertAction {
        
        let IPHONE_TITLE = "iPhone"
        let deviceAction = UIAlertAction(title: IPHONE_TITLE, style: .default) {[weak self] (action) in
            self?.setPortToNone(port)
        }
        
        if let image = UIImage(named: "Device") {
            deviceAction.setValue(image, forKey: "image")
        }
        return deviceAction
    }
    
    func setPortToSpeaker() {
        
        do {
            try self.overrideOutputAudioPort(AVAudioSession.PortOverride.speaker)
        } catch let error as NSError {
            print("audioSession error turning on speaker: \(error.localizedDescription)")
        }
    }
    
    func setPortToNone(_ port: AVAudioSessionPortDescription) {
        
        do {
            // remove speaker if needed
            try self.overrideOutputAudioPort(AVAudioSession.PortOverride.none)
            // set new input
            try self.setPreferredInput(port)
        } catch let error as NSError {
            print("audioSession error change to input: \(AVAudioSession.PortOverride.none.rawValue) with error: \(error.localizedDescription)")
        }
    }
    
    func alertViewSetupForIpad(_ optionMenu: UIAlertController, _ speakerButton: UIButton) {
        optionMenu.modalPresentationStyle = .popover
        if let presenter = optionMenu.popoverPresentationController {
            presenter.sourceView = speakerButton;
            presenter.sourceRect = speakerButton.bounds;
        }
    }
}

extension AVAudioSession {
    
    static var isHeadphonesConnected: Bool {
        return sharedInstance().isHeadphonesConnected
    }
    
    static var isBluetoothConnected: Bool {
        return sharedInstance().isBluetoothConnected
    }
    
    static var isCarAudioConnected: Bool {
        return sharedInstance().isCarAudioConnected
    }
    
    static var isBuiltInSpeaker: Bool {
        return sharedInstance().isBuiltInSpeaker
    }
    
    static var isReceiver: Bool {
        return sharedInstance().isReceiver
    }
    
    static var isBuiltInMic: Bool {
        return sharedInstance().isBuiltInMic
    }
    
    var isCarAudioConnected: Bool {
        return !currentRoute.outputs.filter { $0.isCarAudio }.isEmpty
    }
    
    var isHeadphonesConnected: Bool {
        return !currentRoute.outputs.filter { $0.isHeadphones }.isEmpty
    }
    
    var isBluetoothConnected: Bool {
        return !currentRoute.outputs.filter { $0.isBluetooth }.isEmpty
    }
    
    var isBuiltInSpeaker: Bool {
        return !currentRoute.outputs.filter { $0.isSpeaker }.isEmpty
    }
    
    var isReceiver: Bool {
        return !currentRoute.outputs.filter { $0.isReceiver }.isEmpty
    }
    
    var isBuiltInMic: Bool {
        return !currentRoute.outputs.filter { $0.isBuiltInMic }.isEmpty
    }
}

extension AVAudioSessionPortDescription {
    
    var isHeadphones: Bool {
        return portType == AVAudioSession.Port.headphones  ||  portType == AVAudioSession.Port.headsetMic
    }
    
    var isBluetooth: Bool {
        return portType == AVAudioSession.Port.bluetoothHFP || portType == AVAudioSession.Port.bluetoothA2DP || portType == AVAudioSession.Port.bluetoothLE
    }
    
    var isCarAudio: Bool {
        return portType == AVAudioSession.Port.carAudio
    }
    
    var isSpeaker: Bool {
        return portType == AVAudioSession.Port.builtInSpeaker
    }
    
    var isBuiltInMic: Bool {
        return portType == AVAudioSession.Port.builtInMic
    }
    
    var isReceiver: Bool {
        return portType == AVAudioSession.Port.builtInReceiver
    }
}

Upvotes: 2

Mike

Reputation: 1333

It will depend on your AVAudioSession category.

You can safely assume on an iPhone that you have at least a microphone as input and a speaker as output. If you're trying to get a list of Bluetooth/AirPlay outputs, first you'd have to make sure your session category is reporting them to you:

let audioSession = AVAudioSession.sharedInstance()
do {
    try audioSession.setCategory(.playAndRecord, mode: .default, options: [.allowBluetooth])
    try audioSession.setActive(true)
} catch {
    debugPrint("failed to initialize audio session: \(error)")
}

Then a non-intuitive way to get the available outputs is to check AVAudioSession.availableInputs, since a Bluetooth HFP device usually has a microphone too. I might be assuming a lot here, but that is the only way I know of to consistently get the available outputs.

A better way is to use the MultiRoute category (AVAudioSessionCategoryMultiRoute), which gives you more freedom in accessing the AVAudioSessionPort objects.
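
A rough sketch of that approach (the category, mode, and options here are illustrative, not a drop-in solution):

import AVFoundation

// Sketch: use the multiRoute category, then inspect the ports the session reports.
let session = AVAudioSession.sharedInstance()
do {
    try session.setCategory(.multiRoute, mode: .default, options: [])
    try session.setActive(true)
} catch {
    print("audio session setup failed: \(error)")
}

// availableInputs is the closest thing to an "available routes" list;
// currentRoute.outputs still only reflects the route that is currently active.
for port in session.availableInputs ?? [] {
    print("input: \(port.portName) (\(port.portType.rawValue))")
}
for port in session.currentRoute.outputs {
    print("output: \(port.portName) (\(port.portType.rawValue))")
}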

Upvotes: 3

user1079052

Reputation: 3833

AVAudioSessionRouteDescription *currentRoute = [[AVAudioSession sharedInstance] currentRoute];
for (AVAudioSessionPortDescription *output in currentRoute.outputs) {
    NSLog(@"output port: %@ (%@)", output.portName, output.portType);
}

Upvotes: 1

Andrew Smith

Reputation: 2929

Try something like this; it's more than you need, but you can pare it down:

+ (NSString *)demonstrateInputSelection
{
    NSError* theError = nil;
    BOOL result = YES;
    NSMutableString *info = [[NSMutableString alloc] init];
    [info appendString: @"     Device Audio Input Hardware\n"];

    NSString *str = nil;
    if( iOSMajorVersion < 7 ){   // iOSMajorVersion: app-defined helper; -availableInputs requires iOS 7
        str = @"No input device information available";
        NSLog(@"%@",str);
        [info appendFormat:@"%@\n",str];

        return info;
    }

    AVAudioSession* myAudioSession = [AVAudioSession sharedInstance];

    result = [myAudioSession setCategory:AVAudioSessionCategoryPlayAndRecord error:&theError];
    if (!result)
    {
        NSLog(@"setCategory failed");
    }

    result = [myAudioSession setActive:YES error:&theError];
    if (!result)
    {
        NSLog(@"setActive failed");
    }

    // Get the set of available inputs. If there are no audio accessories attached, there will be
    // only one available input -- the built in microphone.
    NSArray* inputs = [myAudioSession availableInputs];
    str = [NSString stringWithFormat:@"\n--- Ports available on %@: %d ---", [UIDevice currentDevice].name , [inputs count]];
    NSLog(@"%@",str);
    [info appendFormat:@"%@\n",str];

    // Locate the Port corresponding to the built-in microphone.
    AVAudioSessionPortDescription* builtInMicPort = nil;
    AVAudioSessionDataSourceDescription* frontDataSource = nil;

    for (AVAudioSessionPortDescription* port in inputs)
    {
        // Print out a description of the data sources for the built-in microphone
        str = @"\n**********";
        NSLog(@"%@",str);
        [info appendFormat:@"%@\n",str];
        str = [NSString stringWithFormat:@"Port :\"%@\": UID:%@", port.portName, port.UID ];
        NSLog(@"%@",str);
        [info appendFormat:@"%@\n",str];
        if( [port.dataSources count] ){
            str = [NSString stringWithFormat:@"Port has %d data sources",(unsigned)[port.dataSources count] ];
            NSLog(@"%@",str);
            [info appendFormat:@"%@\n",str];
        }

        str = [NSString stringWithFormat:@">%@", port.dataSources];
        NSLog(@"%@",str);
   //     [info appendFormat:@"%@\n",str];

        if( [port.portType isEqualToString:AVAudioSessionPortLineIn] ){
            str = @"Line Input found";
            NSLog(@"%@",str);
            [info appendFormat:@"%@\n",str];
        }
        else if( [port.portType isEqualToString:AVAudioSessionPortUSBAudio] ){
            str = @"USB Audio found";
            NSLog(@"%@",str);
            [info appendFormat:@"%@\n",str];
        }
        else if ([port.portType isEqualToString:AVAudioSessionPortBuiltInMic]){
            builtInMicPort = port;
            str = @"Built-in Mic found";
            NSLog(@"%@",str);
            [info appendFormat:@"%@\n",str];
        }
        else if ([port.portType isEqualToString:AVAudioSessionPortHeadsetMic]){
            builtInMicPort = port;
            str = @"Headset Mic found";
            NSLog(@"%@",str);
            [info appendFormat:@"%@\n",str];
        }
        else{
            str = @"Other input source found";
            NSLog(@"%@",str);
            [info appendFormat:@"%@\n",str];
        }

        // loop over the built-in mic's data sources and attempt to locate the front microphone
        for (AVAudioSessionDataSourceDescription* source in port.dataSources)
        {
            str = [NSString stringWithFormat:@"\nName:%@ (%d) \nPolar:%@ \nType:%@ \nPatterns:%@", source.dataSourceName, [source.dataSourceID intValue], source.selectedPolarPattern, port.portType, source.supportedPolarPatterns];
            NSLog(@"%@",str);
            [info appendFormat:@"%@\n",str];

            //           if ([source.orientation isEqual:AVAudioSessionOrientationFront])
            //           {
            //               frontDataSource = source;
            //               break;
            //           }
        } // end data source iteration

    }

    str = @"\n----  Current Selected Ports ----\n";
    NSLog(@"%@",str);
    [info appendFormat:@"%@",str];

    NSArray *currentInputs = myAudioSession.currentRoute.inputs;
//    str = [NSString stringWithFormat:@"\n%d current input ports", [currentInputs count]];
//    NSLog(@"%@",str);
//    [info appendFormat:@"%@\n",str];
    for( AVAudioSessionPortDescription *port in currentInputs ){
        str = [NSString stringWithFormat:@"\nInput Port :\"%@\":", port.portName ];
        NSLog(@"%@",str);
        [info appendFormat:@"%@\n",str];
        if( [port.dataSources count] ){
            str = [NSString stringWithFormat:@"Port has %d data sources",(unsigned)[port.dataSources count] ];
            NSLog(@"%@",str);
            [info appendFormat:@"%@\n",str];

            str = [NSString stringWithFormat:@"Selected data source:%@",  port.selectedDataSource.dataSourceName];
            NSLog(@"%@",str);
            [info appendFormat:@"%@\n",str];

            if( [port.selectedDataSource.supportedPolarPatterns count] > 0 ){
                str = [NSString stringWithFormat:@"Selected polar pattern:%@", port.selectedDataSource.selectedPolarPattern];
                NSLog(@"%@",str);
                [info appendFormat:@"%@\n",str];
            }
        }
    }

    NSArray *currentOutputs = myAudioSession.currentRoute.outputs;
//    str = [NSString stringWithFormat:@"\n%d current output ports", [currentOutputs count]];
//    NSLog(@"%@",str);
//    [info appendFormat:@"%@\n",str];
    for( AVAudioSessionPortDescription *port in currentOutputs ){
        str = [NSString stringWithFormat:@"\nOutput Port :\"%@\":", port.portName ];
        NSLog(@"%@",str);
        [info appendFormat:@"%@\n",str];
        if( [port.dataSources count] ){
            str = [NSString stringWithFormat:@"Port has %d data sources",(unsigned)[port.dataSources count] ];
            NSLog(@"%@",str);
            [info appendFormat:@"%@\n",str];

            str = [NSString stringWithFormat:@"Selected data source:%@",  port.selectedDataSource.dataSourceName];
            NSLog(@"%@",str);
            [info appendFormat:@"%@\n",str];
        }

    }

//    str = [NSString stringWithFormat:@"\Current Route: %@ Source:%@\n", myAudioSession.currentRoute.portName, myAudioSession.preferredInput.selectedDataSource.dataSourceName];
//    NSLog(@"%@",str);
//    [info appendFormat:@"%@\n",str];


    if( myAudioSession.preferredInput.portName ){
        str = [NSString stringWithFormat:@"\nPreferred Port: %@ Source:%@\n", myAudioSession.preferredInput.portName, myAudioSession.preferredInput.selectedDataSource.dataSourceName];
    } else {
        str = @"\nNo Preferred Port set";
    }
    NSLog(@"%@",str);
    [info appendFormat:@"%@\n",str];

    return info;

    // NOTE: everything below is unreachable as written because of the return above.
    // It is kept as a reference for selecting a preferred data source and input;
    // remove the early return if you want it to run.
    if (frontDataSource)
    {
        NSLog(@"Currently selected source is \"%@\" for port \"%@\"", builtInMicPort.selectedDataSource.dataSourceName, builtInMicPort.portName);
        NSLog(@"Attempting to select source \"%@\" on port \"%@\"", frontDataSource, builtInMicPort.portName);

        // Set a preference for the front data source.
        theError = nil;
        result = [builtInMicPort setPreferredDataSource:frontDataSource error:&theError];
        if (!result)
        {
            // an error occurred. Handle it!
            NSLog(@"setPreferredDataSource failed");
        }
    }

    // Make sure the built-in mic is selected for input. This will be a no-op if the built-in mic is
    // already the current input Port.
    theError = nil;
    result = [myAudioSession setPreferredInput:builtInMicPort error:&theError];
    if (!result)
    {
        // an error occurred. Handle it!
        NSLog(@"setPreferredInput failed");
    }

    return info;
}

Upvotes: 3
