jhallmusic

Reputation: 31

Getting audio visualization using Web Audio API to work on iOS

I'm developing an HTML5 audio player for use specifically on iPhones, and I'm trying to get an EQ visualizer working. From what I've found, there are two ways to set this up:


One where you load the mp3 file on demand using an XMLHttpRequest:

// Fetch the mp3 as an ArrayBuffer
var request = new XMLHttpRequest();
request.open('GET', 'sampler.mp3', true);
request.responseType = 'arraybuffer';
request.addEventListener('load', bufferSound, false);
request.send();

function bufferSound(event) {

    var request = event.target;
    // myAudioContext is a (webkit)AudioContext created elsewhere on the page;
    // the old prefixed createBuffer() decodes the ArrayBuffer synchronously
    var buffer = myAudioContext.createBuffer(request.response, false);
    // source is intentionally global so the play/pause controls can reach it
    source = myAudioContext.createBufferSource();
    source.buffer = buffer;

}

You then use source.noteOn and source.noteOff to start and stop the audio. Working this way, I AM able to get the EQ visualization going. BUT, you have to wait until the mp3 file has completely loaded before playback can start, which won't work in our situation.
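For reference, the rest of my hookup looks roughly like this (simplified; the analyser, freqData, and draw names and the fftSize value are just illustrative, and myAudioContext is assumed to be a (webkit)AudioContext created earlier):

// Assumes something like:
// var myAudioContext = new (window.AudioContext || window.webkitAudioContext)();

var analyser = myAudioContext.createAnalyser();
analyser.fftSize = 128;

// Wire the buffer source through the analyser to the speakers
source.connect(analyser);
analyser.connect(myAudioContext.destination);

// Start / stop playback with the old prefixed calls
source.noteOn(0);
// ...later: source.noteOff(0);

// Read the frequency data each animation frame and draw the EQ bars
var freqData = new Uint8Array(analyser.frequencyBinCount);
function draw() {
    analyser.getByteFrequencyData(freqData);
    // ...render freqData to a canvas here...
    window.requestAnimationFrame(draw);
}
draw();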


The other way to do this is to have an <audio> element already on the page, and you get the audio data from that using:

source = myAudioContext.createMediaElementSource(document.querySelector('audio'));

You then use the audio tag's play and pause functions. This solves the loading problem, as it allows the media to start playing as soon as the page loads... BUT the EQ visualization is gone.
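The wiring for this version is basically the same, just with the <audio> element as the source (again simplified; audioEl and the analyser setup are illustrative, reusing the same myAudioContext as above):

var audioEl = document.querySelector('audio');

var analyser = myAudioContext.createAnalyser();
var source = myAudioContext.createMediaElementSource(audioEl);

// Route the <audio> element's output through the analyser to the speakers
source.connect(analyser);
analyser.connect(myAudioContext.destination);

// Playback is controlled through the element itself
audioEl.play();
// ...later: audioEl.pause();

// The analyser is read exactly the same way as before
var freqData = new Uint8Array(analyser.frequencyBinCount);
analyser.getByteFrequencyData(freqData);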


Both methods show the EQ when testing in Chrome on Windows, so there seems to be something specific to iOS/iPhone that isn't letting me get the data from an <audio> tag, but does let me get it if I load the mp3 file on demand.

...

Any ideas out there?

Upvotes: 3

Views: 1165
