Reputation: 21
I've got an iPhone app that uses 140 UIButtons (PNG images), each triggering synthesised sine tones (a maximum of 10 tones at once). The sine tones are generated by highly optimised C code using the AudioToolbox API. Playing 10 sine tones chews about 50% CPU (at 44,100 Hz).
I have presets that turn off 10 tones and turn on 10 different ones. This also causes the corresponding UIButtons to change state from Selected to Normal.
My problem is that when 10 (out of 140) UIButtons change state at the same time, it causes a glitch in the audio (even though the audio only chews 50% CPU). So there is a big spike in CPU caused by changing the button states.
Is there a way I can handle this? Can I prioritise the audio queue over the GUI? Is it because I have 140 PNG images as buttons? Is it possible to create multi-touch zones that are not buttons? I'm just looking for any advice on reducing the GUI's impact on audio processing.
Upvotes: 1
Views: 170
Reputation: 21
If you are developing an audio synthesis application, Apple's recommendation is to use the AudioUnit API instead of the higher-level AudioToolbox layer. One of the reasons Apple gives for using the AudioUnit API is:
"Responsive playback of synthesized sounds, such as for musical games or synthesized musical instruments"
AudioUnit render threads run at very high priority by default, whereas AudioToolbox threads run at a low priority. It would be better to migrate to the AudioUnit API than to force thread priorities onto AudioToolbox.
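As a rough sketch of what that migration looks like, here is a minimal RemoteIO AudioUnit setup with a render callback. RenderSineTones() and StartSynth() are hypothetical names standing in for your existing C synthesis code and your startup path; stream-format configuration and error checking are omitted for brevity:

    #import <AudioUnit/AudioUnit.h>

    extern void RenderSineTones(SInt16 *out, UInt32 frames); // hypothetical: your existing C synth

    // Render callback: runs on Core Audio's high-priority real-time thread.
    // Do no allocation, locking, or Objective-C messaging in here.
    static OSStatus RenderCallback(void *inRefCon,
                                   AudioUnitRenderActionFlags *ioActionFlags,
                                   const AudioTimeStamp *inTimeStamp,
                                   UInt32 inBusNumber,
                                   UInt32 inNumberFrames,
                                   AudioBufferList *ioData)
    {
        SInt16 *out = (SInt16 *)ioData->mBuffers[0].mData;
        RenderSineTones(out, inNumberFrames);
        return noErr;
    }

    void StartSynth(void)
    {
        // Find and instantiate the RemoteIO output unit.
        AudioComponentDescription desc = {0};
        desc.componentType         = kAudioUnitType_Output;
        desc.componentSubType      = kAudioUnitSubType_RemoteIO;
        desc.componentManufacturer = kAudioUnitManufacturer_Apple;

        AudioComponent comp = AudioComponentFindNext(NULL, &desc);
        AudioUnit unit;
        AudioComponentInstanceNew(comp, &unit);

        // Attach the render callback to the output bus.
        AURenderCallbackStruct cb = { RenderCallback, NULL };
        AudioUnitSetProperty(unit, kAudioUnitProperty_SetRenderCallback,
                             kAudioUnitScope_Input, 0, &cb, sizeof(cb));

        AudioUnitInitialize(unit);
        AudioOutputUnitStart(unit);
    }

Core Audio schedules the render thread itself, so main-thread GUI work (like restyling 10 buttons) no longer competes directly with synthesis.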
Upvotes: 1
Reputation: 16709
You have to perform audio playback on a separate thread. You can raise the priority of the newly created thread using NSThread's setThreadPriority:.
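A minimal sketch of that, assuming hypothetical -startAudioThread and -audioLoop methods inside whatever class owns your playback:

    - (void)startAudioThread
    {
        // Spawn a dedicated playback thread.
        [NSThread detachNewThreadSelector:@selector(audioLoop)
                                 toTarget:self
                               withObject:nil];
    }

    - (void)audioLoop
    {
        // Priority is on a 0.0-1.0 scale; raise it before doing any audio work.
        [NSThread setThreadPriority:1.0];
        // ... run the audio queue / synthesis loop here ...
    }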
Upvotes: 0