Reputation: 45
My basic requirement is to mix two or more audio files (like a DJ kind of app). I can already do this with local audio files, but I would like to know whether two or more Spotify tracks can be played simultaneously, or whether I can play a local file along with a Spotify track.
In short, I want the songs from Spotify to be played through my app's own audio units, which are capable of mixing, rather than through Spotify's audio player.
Please advise.
Upvotes: 0
Views: 4007
Reputation: 2243
You might want to have a look at https://docs.audiostack.ai/docs/automixingservice. I'd be intrigued to know whether it solves your problem.
Upvotes: 0
Reputation: 18776
If you're using CocoaLibSpotify, the Spotify Mac/iOS library, you are given the raw PCM data stream of the decoded audio. The code included with the library uses Core Audio to get that data through to the speakers, but you can certainly add different Audio Units to the graph to achieve your desired result.
As an example, this blog post (by me) discusses adding a graphic EQ to the graph: Core Audio: AUGraph Basics in CocoaLibSpotify. That post could well be a good starting point for making a more complex graph to suit your needs.
It's worth mentioning that the Spotify service will only allow you to stream one audio track at a time, so you'll only be able to do live mixing of one Spotify track with local track(s).
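As a rough illustration of that idea, here is a sketch in plain C (not CocoaLibSpotify's own code; the function names and structure are my own) of an AUGraph with a multichannel mixer in front of the output unit. Bus 0 is fed by a render callback that would pull the decoded Spotify PCM the library hands you, and bus 1 is left for the local track:

```c
// Sketch: AUGraph with a multichannel mixer feeding the hardware output.
// Bus 0 = Spotify PCM (via render callback), bus 1 = local track.
#include <AudioToolbox/AudioToolbox.h>
#include <string.h>

static OSStatus SpotifyRenderCallback(void *inRefCon,
                                      AudioUnitRenderActionFlags *ioActionFlags,
                                      const AudioTimeStamp *inTimeStamp,
                                      UInt32 inBusNumber,
                                      UInt32 inNumberFrames,
                                      AudioBufferList *ioData)
{
    // In a real app, copy inNumberFrames of the PCM that CocoaLibSpotify
    // delivered (e.g. from a ring buffer you fill in its audio delivery
    // callback) into ioData. Here we just output silence as a placeholder.
    for (UInt32 i = 0; i < ioData->mNumberBuffers; i++)
        memset(ioData->mBuffers[i].mData, 0, ioData->mBuffers[i].mDataByteSize);
    return noErr;
}

void BuildMixerGraph(AUGraph *outGraph)
{
    AUGraph graph;
    NewAUGraph(&graph);

    // Mixer node: several inputs whose volumes we can set independently.
    AudioComponentDescription mixerDesc = {
        .componentType = kAudioUnitType_Mixer,
        .componentSubType = kAudioUnitSubType_MultiChannelMixer,
        .componentManufacturer = kAudioUnitManufacturer_Apple
    };
    // Output node: RemoteIO on iOS (use kAudioUnitSubType_DefaultOutput on the Mac).
    AudioComponentDescription outputDesc = {
        .componentType = kAudioUnitType_Output,
        .componentSubType = kAudioUnitSubType_RemoteIO,
        .componentManufacturer = kAudioUnitManufacturer_Apple
    };

    AUNode mixerNode, outputNode;
    AUGraphAddNode(graph, &mixerDesc, &mixerNode);
    AUGraphAddNode(graph, &outputDesc, &outputNode);
    AUGraphOpen(graph);

    AudioUnit mixerUnit;
    AUGraphNodeInfo(graph, mixerNode, NULL, &mixerUnit);

    // Two input buses: 0 for the Spotify stream, 1 for the local file.
    UInt32 busCount = 2;
    AudioUnitSetProperty(mixerUnit, kAudioUnitProperty_ElementCount,
                         kAudioUnitScope_Input, 0, &busCount, sizeof(busCount));

    // In a real app you'd also set kAudioUnitProperty_StreamFormat on each
    // input bus to match the PCM format you're supplying.

    // Feed bus 0 from the Spotify PCM callback above.
    AURenderCallbackStruct spotifyInput = {
        .inputProc = SpotifyRenderCallback,
        .inputProcRefCon = NULL  // e.g. a pointer to your ring buffer
    };
    AUGraphSetNodeInputCallback(graph, mixerNode, 0, &spotifyInput);

    // Bus 1 would be fed the same way by a callback that reads your local
    // file (or by connecting an AUFilePlayer node instead).

    // Mixer output -> hardware output.
    AUGraphConnectNodeInput(graph, mixerNode, 0, outputNode, 0);

    AUGraphInitialize(graph);
    AUGraphStart(graph);

    *outGraph = graph;
}
```

With a graph like that, per-bus volume (kMultiChannelMixerParam_Volume on the mixer's input scope) gives you the DJ-style crossfading between the Spotify stream and the local track.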
Upvotes: 1
Reputation: 8667
If Spotify allows you to pull directly from an audio stream or file, then you can certainly use Core Audio to do this. Unfortunately, AV Foundation is unable to do any live audio processing (though this is changing in iOS 6), so you may be stuck using Core Audio to combine your audio streams (whether local or remote) and then using your audio units to do the actual audio processing.
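For the local side of that, a sketch along these lines (the function names are just illustrative) shows how Extended Audio File Services can decode a local file into the same linear PCM format as the remote stream, so both sources can be handed to whatever mixer unit or callback you end up using:

```c
// Sketch: decode a local track to linear PCM with Extended Audio File Services.
#include <AudioToolbox/AudioToolbox.h>

ExtAudioFileRef OpenLocalTrack(CFURLRef url, AudioStreamBasicDescription pcmFormat)
{
    ExtAudioFileRef file = NULL;
    if (ExtAudioFileOpenURL(url, &file) != noErr)
        return NULL;

    // Have the API convert to our client format on the fly
    // (e.g. 44.1 kHz interleaved stereo, matching the remote stream).
    ExtAudioFileSetProperty(file, kExtAudioFileProperty_ClientDataFormat,
                            sizeof(pcmFormat), &pcmFormat);
    return file;
}

// Pull the next chunk of decoded frames; returns the number actually read
// (0 at end of file). The caller hands these buffers to the mixer.
UInt32 ReadLocalTrackChunk(ExtAudioFileRef file, AudioBufferList *bufferList,
                           UInt32 framesWanted)
{
    UInt32 frames = framesWanted;
    if (ExtAudioFileRead(file, &frames, bufferList) != noErr)
        return 0;
    return frames;
}
```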
Upvotes: 0