Reputation: 7606
I have run through an Audio Units tutorial for a sine wave generator and done a bit of reading, and I understand basically how it works. What I would actually like to do for my app is play a short sound file in response to some external event. These sounds would be about 1-2 seconds in duration and occur at a rate of about 1-2 per second.
Where I am right now is trying to figure out how to play an actual audio file with my audio unit rather than generating a sine wave. So my question is: how do I get an audio unit to play an audio file?
Do I simply read bytes from the audio file into the buffer in the render callback? (If so, what class do I need to deal with to open / convert / decompress / read the audio file?)
Or is there some simpler method where I could just hand off the entire buffer and tell it to play?
Any names of specific classes or APIs I will need to look at to accomplish this would be very helpful.
Upvotes: 2
Views: 1353
Reputation: 362
If you don't mind being tied to iOS 5+, you should look into AUFilePlayer. It is much easier than using the render callbacks, and you don't have to worry about setting up your own ring buffer (something you would need to do if you want to avoid loading all of your audio data into memory on start-up).
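For reference, here is a rough sketch of what an AUFilePlayer setup looks like with the C Audio Toolbox API: a generator node connected to Remote I/O in an AUGraph, with the file scheduled on the player. It assumes a CFURLRef named soundURL pointing at your file, and all error checking is omitted, so treat it as an outline rather than drop-in code.

    #include <AudioToolbox/AudioToolbox.h>
    #include <AudioUnit/AudioUnit.h>

    // Rough sketch (error checking omitted): build an AUGraph with an
    // AUFilePlayer feeding Remote I/O, then schedule a file on the player.
    static void playFileWithAUFilePlayer(CFURLRef soundURL)
    {
        AUGraph graph;
        AUNode playerNode, outputNode;
        AudioUnit playerUnit;

        AudioComponentDescription playerDesc = {
            .componentType = kAudioUnitType_Generator,
            .componentSubType = kAudioUnitSubType_AudioFilePlayer,
            .componentManufacturer = kAudioUnitManufacturer_Apple
        };
        AudioComponentDescription outputDesc = {
            .componentType = kAudioUnitType_Output,
            .componentSubType = kAudioUnitSubType_RemoteIO,
            .componentManufacturer = kAudioUnitManufacturer_Apple
        };

        NewAUGraph(&graph);
        AUGraphAddNode(graph, &playerDesc, &playerNode);
        AUGraphAddNode(graph, &outputDesc, &outputNode);
        AUGraphOpen(graph);
        AUGraphNodeInfo(graph, playerNode, NULL, &playerUnit);
        AUGraphConnectNodeInput(graph, playerNode, 0, outputNode, 0);
        AUGraphInitialize(graph);

        // Tell the player which file to play...
        AudioFileID audioFile;
        AudioFileOpenURL(soundURL, kAudioFileReadPermission, 0, &audioFile);
        AudioUnitSetProperty(playerUnit, kAudioUnitProperty_ScheduledFileIDs,
                             kAudioUnitScope_Global, 0, &audioFile, sizeof(audioFile));

        // ...then schedule the whole file, starting from frame 0.
        AudioStreamBasicDescription fileFormat;
        UInt32 propSize = sizeof(fileFormat);
        AudioFileGetProperty(audioFile, kAudioFilePropertyDataFormat, &propSize, &fileFormat);
        UInt64 packetCount = 0;
        propSize = sizeof(packetCount);
        AudioFileGetProperty(audioFile, kAudioFilePropertyAudioDataPacketCount, &propSize, &packetCount);

        ScheduledAudioFileRegion region = {0};
        region.mTimeStamp.mFlags = kAudioTimeStampSampleTimeValid;
        region.mTimeStamp.mSampleTime = 0;
        region.mAudioFile = audioFile;
        region.mFramesToPlay = (UInt32)(packetCount * fileFormat.mFramesPerPacket);
        AudioUnitSetProperty(playerUnit, kAudioUnitProperty_ScheduledFileRegion,
                             kAudioUnitScope_Global, 0, &region, sizeof(region));

        UInt32 primeFrames = 0;   // 0 = use the default priming
        AudioUnitSetProperty(playerUnit, kAudioUnitProperty_ScheduledFilePrime,
                             kAudioUnitScope_Global, 0, &primeFrames, sizeof(primeFrames));

        AudioTimeStamp startTime = {0};
        startTime.mFlags = kAudioTimeStampSampleTimeValid;
        startTime.mSampleTime = -1;   // -1 = start on the next render cycle
        AudioUnitSetProperty(playerUnit, kAudioUnitProperty_ScheduleStartTimeStamp,
                             kAudioUnitScope_Global, 0, &startTime, sizeof(startTime));

        AUGraphStart(graph);
    }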
Upvotes: 1
Reputation: 16246
OK, check this: http://developer.apple.com/library/ios/samplecode/MixerHost/Introduction/Intro.html
EDIT: That is a sample project. This page has detailed instructions with inline code to set up common configurations: http://developer.apple.com/library/ios/ipad/#DOCUMENTATION/MusicAudio/Conceptual/AudioUnitHostingGuide_iOS/ConstructingAudioUnitApps/ConstructingAudioUnitApps.html#//apple_ref/doc/uid/TP40009492-CH16-SW1
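If you go the render-callback route that sample takes (MixerHost reads its audio files into memory up front and feeds them from input render callbacks), Extended Audio File Services is the piece that opens, decompresses, and converts the file for you. Below is a rough sketch of loading a whole short file into one buffer, assuming a CFURLRef named soundURL and with error checking omitted; it asks for plain 16-bit mono PCM to keep the example short, whereas the sample project uses a different client format.

    #include <AudioToolbox/AudioToolbox.h>
    #include <stdlib.h>

    // Rough sketch: open the file, ask ExtAudioFile to decompress/convert it to
    // 16-bit mono PCM at the file's own sample rate, and read the whole thing
    // into a buffer a render callback can later copy from.
    static SInt16 *loadSoundFile(CFURLRef soundURL, SInt64 *outFrameCount)
    {
        ExtAudioFileRef extFile;
        ExtAudioFileOpenURL(soundURL, &extFile);

        // The format the file is stored in on disk (we reuse its sample rate).
        AudioStreamBasicDescription fileFormat;
        UInt32 propSize = sizeof(fileFormat);
        ExtAudioFileGetProperty(extFile, kExtAudioFileProperty_FileDataFormat,
                                &propSize, &fileFormat);

        // The format we want handed back to us: packed 16-bit signed mono PCM.
        AudioStreamBasicDescription clientFormat = {0};
        clientFormat.mSampleRate       = fileFormat.mSampleRate;
        clientFormat.mFormatID         = kAudioFormatLinearPCM;
        clientFormat.mFormatFlags      = kAudioFormatFlagIsSignedInteger | kAudioFormatFlagIsPacked;
        clientFormat.mChannelsPerFrame = 1;
        clientFormat.mBitsPerChannel   = 16;
        clientFormat.mBytesPerFrame    = 2;
        clientFormat.mBytesPerPacket   = 2;
        clientFormat.mFramesPerPacket  = 1;
        ExtAudioFileSetProperty(extFile, kExtAudioFileProperty_ClientDataFormat,
                                sizeof(clientFormat), &clientFormat);

        // Find out how long the file is, allocate a buffer, and read it all in.
        SInt64 fileFrames = 0;
        propSize = sizeof(fileFrames);
        ExtAudioFileGetProperty(extFile, kExtAudioFileProperty_FileLengthFrames,
                                &propSize, &fileFrames);

        SInt16 *samples = malloc(fileFrames * sizeof(SInt16));
        AudioBufferList bufferList;
        bufferList.mNumberBuffers = 1;
        bufferList.mBuffers[0].mNumberChannels = 1;
        bufferList.mBuffers[0].mDataByteSize   = (UInt32)(fileFrames * sizeof(SInt16));
        bufferList.mBuffers[0].mData           = samples;

        UInt32 framesToRead = (UInt32)fileFrames;
        ExtAudioFileRead(extFile, &framesToRead, &bufferList);
        ExtAudioFileDispose(extFile);

        *outFrameCount = fileFrames;
        return samples;
    }

In the render callback you would then copy from that buffer into ioData->mBuffers[0].mData, advancing a read index each call; once the index reaches the frame count the sound has finished playing.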
Upvotes: 3