Reputation: 704
I want to integrate the FFTView from AudioKit into my app to show this cool visualiser.
The problem I am facing is that the audio I am playing comes from a 2014 library (TritonSDK), which streams audio from a remote source to provide a radio station to listen to. So I have no control over the audio whatsoever; everything is handled by the SDK.
The SDK does, however, expose an AudioQueueRef
property, but from what I can see in Apple's docs, there is no way to subscribe to a queue's audio data if you did not create the queue yourself.
So I thought, plan B: use AVAudioEngine to pick up the sound that is already playing. Unfortunately, I don't get any data for the visualiser (I get back an array of zero-valued Floats).
Any idea what I might be doing wrong, or is what I'm trying even possible? Thanks in advance!
My setup:
import AudioKit
import AVFAudio
final class AudioCapture: ObservableObject {
    @Published private(set) var node: Node

    private let engine = AVAudioEngine()

    init() {
        self.node = NodeWrapper(avAudioNode: engine.mainMixerNode)
    }

    func start() {
        do {
            try engine.start()
        } catch {
            print("Error starting AVAudioEngine: \(error.localizedDescription)")
        }
        if node.avAudioNode != engine.mainMixerNode {
            node = NodeWrapper(avAudioNode: engine.mainMixerNode)
        }
    }

    func stop() {
        engine.stop()
    }
}
// MARK: - Utility
private final class NodeWrapper: Node {
    var connections: [Node] = []
    let avAudioNode: AVAudioNode

    init(avAudioNode: AVAudioNode) {
        self.avAudioNode = avAudioNode
    }
}
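For reference, my understanding is that FFTView reads samples roughly the way a manual tap on the mixer would. This is a minimal sketch I used to check whether any samples flow through the engine at all (the RMS print is just for debugging, not part of my app):

```swift
import AVFAudio
import Foundation

// Sketch: tap the main mixer and print the RMS level of each buffer,
// to verify whether the engine is actually carrying any audio.
let engine = AVAudioEngine()
let mixer = engine.mainMixerNode
let format = mixer.outputFormat(forBus: 0)

mixer.installTap(onBus: 0, bufferSize: 1024, format: format) { buffer, _ in
    guard let channelData = buffer.floatChannelData else { return }
    let frames = Int(buffer.frameLength)
    let samples = UnsafeBufferPointer(start: channelData[0], count: frames)
    let rms = sqrt(samples.reduce(0) { $0 + $1 * $1 } / Float(max(frames, 1)))
    print("RMS:", rms) // stays 0.0 for me, since the SDK's audio is not routed into this engine
}

try? engine.start()
```

The RMS staying at zero matches what I see in FFTView: the engine runs, but nothing is connected into its mixer, so there is no signal to analyse.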
import SwiftUI
import AudioKitUI
struct RadioView: View {
    @ObservedObject var viewModel: RadioViewModel

    @Environment(\.safeAreaInsets) private var safeAreaInsets: EdgeInsets

    var body: some View {
        FFTView(viewModel.audioCapture.node)
    }
}
Upvotes: 1
Views: 27