Sophie_S

Reputation: 31

SwiftUI AudioKit iOS application freezing when opened from background

I am learning Swift and working on my first app: a musical tuner that should work on both macOS and iOS. I'm using AudioKit for the sound-related processing.

The app seems to work fine on both macOS and iOS until I send it to the background on iOS and re-open it later. It doesn't happen consistently, so it's difficult to test, but it happens about half the time, especially after I've left the app in the background for multiple hours.

My guess is that this "freezing" happens because AudioKit's AudioEngine isn't started up properly after the app returns from the background. All my app does is show the current note, frequency, etc., so if the app isn't getting any sound information, there would be no UI updates.
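A quick way to check this hypothesis would be temporary logging like the following (purely diagnostic; it assumes the @Environment(\.scenePhase) property used in Attempt 2 further down):

.onChange(of: scenePhase) { _, newPhase in
    // Diagnostic only: does the underlying AVAudioEngine still report
    // itself as running after a round trip through the background?
    print("scenePhase -> \(newPhase), engine running: \(td.engine.avEngine.isRunning)")
}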

Full reproduction code

Below is the simplest complete code needed to reproduce the problem (it results in a screen showing what note is playing).

To see the full source code, here's my GitHub repository (at my most recent commit).

Here's what the complete UI looks like -- in this screenshot, I had opened the app from the background, and it stays frozen in exactly this position until I fully close and restart the app.

[Screenshot of UI]

import AudioKit
import SoundpipeAudioKit // PitchTap
import AudioKitEX // Fader
import AVFoundation // AVAudioSession
import SwiftUI

struct ContentView: View {
    @StateObject var td = ToneDetector()
    var body: some View {
        VStack {
            Text(td.data.note)
                .font(.system(size: 100, design: .serif))
        }
        .task {
            // Ask for microphone access when the view first appears.
            await PermissionsChecker.getMicrophoneAccess()
        }
        .task {
            // Start the AudioKit engine if it isn't already running.
            if !td.engine.avEngine.isRunning {
                td.start()
            }
        }
    }
}

struct TunerData {
    var note = "-"
}

class ToneDetector : ObservableObject, HasAudioEngine {
    @Published var data = TunerData()
    let engine = AudioEngine()
    
    let mic: AudioEngine.InputNode
    let tappableA: Fader
    let tappableB: Fader
    let silence: Fader
    var tracker: PitchTap!
    
    // Reference frequencies for the 12 notes of octave 0 (C0 = 16.35 Hz).
    let noteFrequencies: [Float] = [16.35, 17.32, 18.35, 19.45, 20.6, 21.83, 23.12, 24.5, 25.96, 27.5, 29.14, 30.87]
    let noteNamesSharps = ["C", "C♯", "D", "D♯", "E", "F", "F♯", "G", "G♯", "A", "A♯", "B"]
    let noteNamesFlats = ["C", "D♭", "D", "E♭", "E", "F", "G♭", "G", "A♭", "A", "B♭", "B"]
    
    init() {
        guard let input = engine.input else { fatalError("AudioEngine has no input device") }
        mic = input

        // Route the mic through two tappable faders into a silenced fader,
        // so the engine has an output chain but produces no audible sound.
        tappableA = Fader(mic)
        tappableB = Fader(tappableA)
        silence = Fader(tappableB, gain: 0)
        engine.output = silence
        
        PermissionsChecker.setSessionParameters()
        
        tracker = PitchTap(mic) { [weak self] pitch, amp in
            // PitchTap calls back off the main thread; hop to the main
            // thread before touching the @Published property.
            DispatchQueue.main.async {
                self?.update(pitch[0], amp[0])
            }
        }
        tracker.start()
    }
    
    func update(_ pitch: AUValue, _ amp: AUValue) {
        guard amp > 0.07 else { return }
        
        // Transpose the detected pitch up or down into the octave covered
        // by noteFrequencies
        var frequency = pitch
        while frequency > noteFrequencies[noteFrequencies.count - 1] {
            frequency /= 2.0
        }
        while frequency < noteFrequencies[0] {
            frequency *= 2.0
        }
        
        // Find the known note frequency we are closest to
        var minDistance: Float = 10000.0
        var index = 0
        for possibleIndex in noteFrequencies.indices {
            let distance = abs(noteFrequencies[possibleIndex] - frequency)
            if distance < minDistance {
                index = possibleIndex
                minDistance = distance
            }
        }
        if noteNamesSharps[index] == noteNamesFlats[index] {
            data.note = noteNamesSharps[index]
        } else {
            data.note = "\(noteNamesSharps[index]) / \(noteNamesFlats[index])"
        }
    }
}

class PermissionsChecker {
    
    static func getMicrophoneAccess() async {
        if #available(iOS 17.0, macOS 14.0, *) {
            switch AVAudioApplication.shared.recordPermission {
            case .granted:
                return
            case .denied:
                print("Microphone permission not granted.")
                return
            default:
                break
            }
            // requestRecordPermission() returns whether access was granted;
            // the result isn't needed here.
            _ = await AVAudioApplication.requestRecordPermission()
        }
        // (Earlier OS versions would need AVAudioSession's older
        // requestRecordPermission API, which this app doesn't handle yet.)
    }
    
    static func setSessionParameters() {
        #if os(iOS)
            do {
                Settings.bufferLength = .short
                try AVAudioSession.sharedInstance().setPreferredIOBufferDuration(Settings.bufferLength.duration)
                try AVAudioSession.sharedInstance().setCategory(.playAndRecord, mode: .default, options: [.defaultToSpeaker, .mixWithOthers, .allowBluetooth])
                try AVAudioSession.sharedInstance().setActive(true)
            } catch {
                print("Failed to configure AVAudioSession: \(error)")
            }
        #endif
    }
}

Please help! Does anyone have ideas about why the app is hanging in this case? It's not necessarily an issue with starting and stopping AudioKit, but that seems like the most likely cause at this point.
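For example, I'm wondering whether the audio session gets interrupted or deactivated while the app sits in the background, and whether that needs to be handled explicitly. Here's a sketch of what I mean -- this is an untested guess (iOS only, since AVAudioSession isn't available on macOS), and the observer token would need to be stored somewhere, e.g. as a property on ToneDetector:

// Untested sketch: watch for audio session interruptions and try
// restarting the engine when the interruption ends.
let interruptionObserver = NotificationCenter.default.addObserver(
    forName: AVAudioSession.interruptionNotification,
    object: AVAudioSession.sharedInstance(),
    queue: .main
) { notification in
    guard let rawType = notification.userInfo?[AVAudioSessionInterruptionTypeKey] as? UInt,
          let type = AVAudioSession.InterruptionType(rawValue: rawType) else { return }
    switch type {
    case .began:
        print("Audio session interruption began")
    case .ended:
        print("Audio session interruption ended")
        // try? engine.start()  // not sure this is the right fix
    @unknown default:
        break
    }
}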

I've tried several methods of getting the AudioEngine to start and stop properly as the app moves between the foreground and background.

Attempt 1: .onAppear, .onDisappear

struct ContentView: View {
    let engine = AudioEngine()
    var body: some View {
        VStack {
            ...
        }
        .onAppear {
            engine.start()
        }
        .onDisappear {
            engine.stop()
        }
    }
}

Attempt 2: scenePhase

struct ContentView: View {
    let engine = AudioEngine()
    @Environment(\.scenePhase) var scenePhase
    var body: some View {
        VStack {
            ...
        }
        .onChange(of: scenePhase) { _, newPhase in
            switch newPhase {
            case .active:
                if !engine.avEngine.isRunning {
                    engine.start()
                }
            case .inactive, .background:
                if engine.avEngine.isRunning {
                    engine.stop()
                }
            default:
                break
            }
        }
    }
}

Attempt 3: Just leave it running

struct ContentView: View {
    let engine = AudioEngine()
    var body: some View {
        VStack {
            ...
        }
        .task {
            if !engine.avEngine.isRunning {
                engine.start()
            }
        }
    }
}

None of the above approaches worked for me, and I still get the "hanging" when I open the iOS app from the background.

Note that I simplified the examples above a bit to make them more readable; for example, engine.start() needs to be wrapped in a do/catch block.
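For reference, the actual call looks something like this:

do {
    try engine.start()
} catch {
    print("Failed to start AudioEngine: \(error)")
}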

Upvotes: 3

Views: 117

Answers (0)
