lololarumpel

Reputation: 57

AudioKit playback cracks

I want to analyze the microphone input frequency and then play the note closest to the detected frequency. I did that with AudioKit.

This is working right now, but since I implemented AudioKit for the frequency feature, the sound that plays after the frequency detection sometimes crackles during playback. That started after I implemented AudioKit; everything was fine before that...

var mic: AKMicrophone!
var tracker: AKFrequencyTracker!
var silence: AKBooster!

    // Route the mic through the frequency tracker into a zero-gain booster,
    // so the input is analyzed without being audible in the output.
    func initFrequencyTracker() {
        AKSettings.channelCount = 2
        AKSettings.audioInputEnabled = true
        AKSettings.defaultToSpeaker = true
        AKSettings.allowAirPlay = true
        AKSettings.useBluetooth = true
        AKSettings.allowHapticsAndSystemSoundsDuringRecording = true
        mic = AKMicrophone()
        tracker = AKFrequencyTracker(mic)
        silence = AKBooster(tracker, gain: 0)
    }

    // Stop the engine and detach the output once tracking is no longer needed.
    func deinitFrequencyTracker() {
        AKSettings.audioInputEnabled = false
        plotTimer.invalidate()
        do {
            try AudioKit.stop()
            AudioKit.output = nil
        } catch {
            print(error)
        }
    }

    // Start the engine with the silent chain as its output and poll the
    // tracker every 0.1 s to update the UI.
    func initPlotTimer() {
        AudioKit.output = silence
        do {
            try AKSettings.setSession(category: .playAndRecord, with: [.defaultToSpeaker, .allowBluetooth, .allowAirPlay, .allowBluetoothA2DP])
            try AudioKit.start()
        } catch {
            AKLog("AudioKit did not start!")
        }
        setupPlot()
        plotTimer = Timer.scheduledTimer(timeInterval: 0.1, target: self, selector: #selector(updatePlotUI), userInfo: nil, repeats: true)
    }

    func setupPlot() {
        let plot = AKNodeOutputPlot(mic, frame: audioInputPlot.bounds)
        plot.translatesAutoresizingMaskIntoConstraints = false
        plot.alpha = 0.3
        plot.plotType = .rolling
        plot.shouldFill = true
        plot.shouldCenterYAxis = false
        plot.shouldMirror = true
        plot.color = UIColor(named: uiFarbe)
        audioInputPlot.addSubview(plot)

        // Pin the AKNodeOutputPlot to the audioInputPlot
        var constraints = [plot.leadingAnchor.constraint(equalTo: audioInputPlot.leadingAnchor)]
        constraints.append(plot.trailingAnchor.constraint(equalTo: audioInputPlot.trailingAnchor))
        constraints.append(plot.topAnchor.constraint(equalTo: audioInputPlot.topAnchor))
        constraints.append(plot.bottomAnchor.constraint(equalTo: audioInputPlot.bottomAnchor))
        constraints.forEach { $0.isActive = true }
    }

    @objc func updatePlotUI() {
        if tracker.amplitude > 0.3 {
            let trackerFrequency = Float(tracker.frequency)

            guard trackerFrequency < 7_000 else {
                // This is a bit of hack because of modern Macbooks giving super high frequencies
                return
            }

            // Fold the frequency into the octave range covered by noteFrequencies.
            var frequency = trackerFrequency
            while frequency > Float(noteFrequencies[noteFrequencies.count - 1]) {
                frequency /= 2.0
            }
            while frequency < Float(noteFrequencies[0]) {
                frequency *= 2.0
            }

            var minDistance: Float = 10_000.0
            var index = 0

            for i in 0..<noteFrequencies.count {
                let distance = fabsf(Float(noteFrequencies[i]) - frequency)
                if distance < minDistance {
                    index = i
                    minDistance = distance
                }
                print(minDistance, distance)
            }
            //                let octave = Int(log2f(trackerFrequency / frequency))

            frequencyLabel.text = String(format: "%0.1f", tracker.frequency)

            // Restart the player only when the transposed note actually changes.
            if frequencyTranspose(note: notesToTanspose[index]) != droneLabel.text {
                momentaneNote = frequencyTranspose(note: notesToTanspose[index])
                droneLabel.text = momentaneNote
                stopSinglePlayer()
                DispatchQueue.main.asyncAfter(deadline: .now() + 0.03, execute: {
                    self.prepareSinglePlayerFirstForStart(note: self.momentaneNote)
                    self.startSinglePlayer()
                })
            }

        }
    }

    // Transpose the detected note for the selected instrument: +2 semitones
    // for Bb instruments, -3 semitones for Eb instruments, wrapping by an
    // octave when the index would run off the end of the array.
    func frequencyTranspose(note: String) -> String {
        var indexNote = notesToTanspose.firstIndex(of: note)!
        let chosenInstrument = UserDefaults.standard.object(forKey: "whichInstrument") as! String
        if chosenInstrument == "Bb" {
            if indexNote + 2 >= notesToTanspose.count {
                indexNote -= 12
            }
            return notesToTanspose[indexNote + 2]
        } else if chosenInstrument == "Eb" {
            if indexNote - 3 < 0 {
                indexNote += 12
            }
            return notesToTanspose[indexNote - 3]
        } else {
            return note
        }
    }

Upvotes: 0

Views: 107

Answers (1)

punkbit

Reputation: 7717

It appears that your implementation can be improved slightly by putting iOS multithreading principles into practice. Now, I'm not an expert in the subject, but let's look at the statement: "the sound which plays after the frequency detection cracks sometimes during playback".

I'd like to point out that the timing of the crackle is random or unpredictable, and that it happens while computation is going on.

So, move code that doesn't need to run on the main thread to a background thread (https://developer.apple.com/documentation/dispatch).
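Not part of the original answer, but as a rough sketch of what that could look like for the question's updatePlotUI, assuming tracker.amplitude and tracker.frequency are plain numeric properties that are cheap and safe to read here; all the other names come from the question's code:

    @objc func updatePlotUI() {
        // Read the tracker's current values once, up front.
        let amplitude = tracker.amplitude
        let trackerFrequency = Float(tracker.frequency)

        // Skip quiet input and the spurious super-high frequencies.
        guard amplitude > 0.3, trackerFrequency < 7_000 else { return }

        // Do the note-matching math off the main thread.
        DispatchQueue.global(qos: .userInitiated).async { [weak self] in
            guard let self = self else { return }

            // Fold the frequency into the octave range covered by noteFrequencies.
            var frequency = trackerFrequency
            while frequency > Float(self.noteFrequencies[self.noteFrequencies.count - 1]) {
                frequency /= 2.0
            }
            while frequency < Float(self.noteFrequencies[0]) {
                frequency *= 2.0
            }

            // Find the index of the nearest note.
            var minDistance: Float = 10_000.0
            var index = 0
            for i in 0..<self.noteFrequencies.count {
                let distance = fabsf(Float(self.noteFrequencies[i]) - frequency)
                if distance < minDistance {
                    index = i
                    minDistance = distance
                }
            }

            let note = self.frequencyTranspose(note: self.notesToTanspose[index])

            // Only the UIKit and player calls hop back to the main queue.
            DispatchQueue.main.async {
                self.frequencyLabel.text = String(format: "%0.1f", trackerFrequency)
                if note != self.droneLabel.text {
                    self.momentaneNote = note
                    self.droneLabel.text = note
                    self.stopSinglePlayer()
                    DispatchQueue.main.asyncAfter(deadline: .now() + 0.03) {
                        self.prepareSinglePlayerFirstForStart(note: self.momentaneNote)
                        self.startSinglePlayer()
                    }
                }
            }
        }
    }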


While refactoring, you can test your implementation by changing how often the Timer's callback fires. Reduce the interval to 0.05, for example, so the computation runs more often, and the crackles should become easier to reproduce; increase it to, say, 0.2, so it runs less often, and you'll probably hear fewer random crackles.
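For example, with the Timer from the question's initPlotTimer (the interval values here are just for experimentation):

    // Shorter interval = the computation runs more often = more chances to crackle.
    plotTimer = Timer.scheduledTimer(timeInterval: 0.05, // try 0.2 as well
                                     target: self,
                                     selector: #selector(updatePlotUI),
                                     userInfo: nil,
                                     repeats: true)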

Now, this is easier said than done where concurrency is concerned, but that's what you need to improve.

Upvotes: 0
