Matías Insaurralde

Reputation: 1222

AVAudioFile from memory?

I'm playing around with AVAudioFile and AVAudioPlayerNode.

I have implemented a custom streaming protocol that receives audio chunks and handles all the buffering, but I haven't been able to play the first chunk without using the filesystem (I don't want to use the FS at this time).

That is: it works fine when I write the data to a temporary file and load it using AVAudioFile (then I initialize an AVAudioPCMBuffer, use AVAudioFile.readIntoBuffer, and finally AVAudioPlayerNode.scheduleBuffer). But I'm not able to load the buffer directly from the first chunk (NSData).

Should I implement a custom NSURLProtocol and try to initialize the AVAudioFile from a custom NSURL?

import AVFoundation

func play(data: NSData) throws {
    // Write the chunk to a temporary file (the workaround described above).
    let destPath = NSTemporaryDirectory() + "chunk.mp3"
    data.writeToFile(destPath, atomically: true)
    // A file path needs NSURL(fileURLWithPath:); NSURL(string:) expects a URL string.
    let audioFile = try AVAudioFile(forReading: NSURL(fileURLWithPath: destPath))
    // frameCapacity is measured in frames, not bytes, so use the file's frame count.
    let audioBuffer = AVAudioPCMBuffer(PCMFormat: audioFile.processingFormat,
                                       frameCapacity: AVAudioFrameCount(audioFile.length))
    try audioFile.readIntoBuffer(audioBuffer)
    /*
        I didn't put any AVAudioPlayerNode/AVAudioEngine in this code to keep it simple and describe the situation,
        anyway this works fine when everything's properly initialized:
    */
    player.scheduleBuffer(audioBuffer, completionHandler: nil)
}

Upvotes: 4

Views: 2147

Answers (1)

quellish

Reputation: 21244

From the documentation:

To play streamed audio content, such as from a network connection, use Audio File Stream Services in concert with Audio Queue Services. Audio File Stream Services parses audio packets and metadata from common audio file container formats in a network bitstream. You can also use it to parse packets and metadata from on-disk files.

The key point here is to use Audio File Stream Services to get the data into buffers and enqueue it using Audio Queue Services. The networking portion is typically done using the CFNetwork APIs, which are much lower level than NSURLProtocol.
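In outline, it looks something like the minimal sketch below (Swift 2-era syntax to match the question; the callback bodies, the MP3 type hint, and the didReceiveChunk function are placeholders, not drop-in code). You open a parser with AudioFileStreamOpen, feed it each network chunk with AudioFileStreamParseBytes, and receive parsed packets in the packets callback:

import AudioToolbox

// Fires as the parser discovers stream properties (e.g.
// kAudioFileStreamProperty_DataFormat); read them with
// AudioFileStreamGetProperty.
func propertyCallback(clientData: UnsafeMutablePointer<Void>,
                      streamID: AudioFileStreamID,
                      propertyID: AudioFileStreamPropertyID,
                      flags: UnsafeMutablePointer<AudioFileStreamPropertyFlags>) {
    // e.g. grab the AudioStreamBasicDescription and create the audio queue.
}

// Fires when complete packets have been parsed out of the bitstream;
// this is where data gets copied into audio queue buffers.
func packetsCallback(clientData: UnsafeMutablePointer<Void>,
                     numberBytes: UInt32,
                     numberPackets: UInt32,
                     inputData: UnsafePointer<Void>,
                     packetDescriptions: UnsafeMutablePointer<AudioStreamPacketDescription>) {
    // Copy packets into an AudioQueueBuffer and enqueue it.
}

var streamID: AudioFileStreamID = nil
// kAudioFileMP3Type is only a hint; pass 0 if the container is unknown.
AudioFileStreamOpen(nil, propertyCallback, packetsCallback, kAudioFileMP3Type, &streamID)

// Feed every received chunk straight to the parser; no filesystem involved.
func didReceiveChunk(data: NSData) {
    AudioFileStreamParseBytes(streamID, UInt32(data.length), data.bytes, [])
}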

The Apple sample code "AudioFileStreamExample" illustrates a client and server implementation that streams audio. This would be a valid starting point.
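For context, the Audio Queue half looks roughly like this (again a sketch: the buffer size, the callback body, and the wiring of dataFormat from the parser's property listener are left as placeholders):

import AudioToolbox

// The queue hands buffers back here once played; refill and re-enqueue.
func outputCallback(clientData: UnsafeMutablePointer<Void>,
                    queue: AudioQueueRef,
                    buffer: AudioQueueBufferRef) {
    // Refill `buffer` with the next parsed packets, or recycle it.
}

// dataFormat would be filled in from the parser's
// kAudioFileStreamProperty_DataFormat property.
var dataFormat = AudioStreamBasicDescription()
var queue: AudioQueueRef = nil
AudioQueueNewOutput(&dataFormat, outputCallback, nil, nil, nil, 0, &queue)

var buffer: AudioQueueBufferRef = nil
AudioQueueAllocateBuffer(queue, 32 * 1024, &buffer)
// Copy parsed packet data into buffer.memory.mAudioData and set
// buffer.memory.mAudioDataByteSize, then enqueue. VBR formats like MP3
// also need the packet descriptions delivered to the packets callback.
AudioQueueEnqueueBuffer(queue, buffer, 0, nil)
AudioQueueStart(queue, nil)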

Upvotes: 2
