Miguel Bastida

Reputation: 83

Synchronize audio files with HTML5 and JavaScript

I want to combine two audio files into one, synchronized, with HTML5 on the client side. I've seen that the Web Audio API can do many things, but I haven't been able to find out how.

I have links to two audio files (.mp3, .wav ...). What I want is to synchronize these two files, like a voice and a song: I don't want them played one after the other, I want them in sync.

I would like to do it all on the client side using HTML5, without needing the server. Is this possible?

Thank you so much for your help.

Upvotes: 1

Views: 2652

Answers (1)

Nick Jillings

Reputation: 122

As I understand it, you have two audio files that you want to render together on the client. The Web Audio API can do this for you quite easily, entirely in JavaScript. A good place to start is http://www.html5rocks.com/en/tutorials/webaudio/intro/

An example script would be:

var context = new (window.AudioContext || window.webkitAudioContext)(); // Create an audio context

// Create an XML HTTP Request to collect your audio files
// https://developer.mozilla.org/en-US/docs/Web/API/XMLHttpRequest
var xhr1 = new XMLHttpRequest();
var xhr2 = new XMLHttpRequest();
var audio_buffer_1, audio_buffer_2;
xhr1.open("GET","your_url_to_audio_1");
xhr1.responseType = 'arraybuffer';
xhr1.onload = function() {
  // Decode the audio data
  context.decodeAudioData(xhr1.response, function(buffer) {
    audio_buffer_1 = buffer;
  }, function(error){});
};

xhr2.open("GET","your_url_to_audio_2");
xhr2.responseType = 'arraybuffer';
xhr2.onload = function() {
  // Decode the audio data
  context.decodeAudioData(xhr2.response, function(buffer) {
    audio_buffer_2 = buffer;
  }, function(error){});
};

xhr1.send();
xhr2.send();

These would load the Web Audio API AudioBuffers (https://webaudio.github.io/web-audio-api/#AudioBuffer) of your two files into the global variables audio_buffer_1 and audio_buffer_2.
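Since both requests are asynchronous, you also need to wait until *both* buffers have been decoded before mixing. A minimal sketch of a counter-based barrier (the names `onDecoded` and `mixBuffers` are illustrative, not part of the Web Audio API; strings stand in for the decoded AudioBuffers so the pattern is visible on its own):

```javascript
// Run mixBuffers() only after both decode callbacks have fired.
var pending = 2;
var results = {};
var mixedPair = null;

function onDecoded(name, buffer) {
  results[name] = buffer;
  pending -= 1;
  if (pending === 0) {
    mixBuffers(results.voice, results.song);
  }
}

// Stand-in for the mixing step: in the real code this is where you
// would build the OfflineAudioContext from the two decoded buffers.
function mixBuffers(a, b) {
  mixedPair = [a, b];
}

// Each decodeAudioData success callback would then call, for example:
onDecoded("voice", "decoded voice buffer");
onDecoded("song", "decoded song buffer");
```

In the answer's code you would call such a function from inside each `decodeAudioData` success callback, so the mix only starts once both files are ready.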

Now, to render a new audio buffer, you need to use an OfflineAudioContext:

// Assumes both buffers are of the same length. If not you need to modify the 2nd argument below
var offlineContext = new OfflineAudioContext(context.destination.channelCount, audio_buffer_1.duration * context.sampleRate, context.sampleRate);
var summing = offlineContext.createGain();
summing.connect(offlineContext.destination);
// Build the two buffer source nodes, attach their buffers
// and connect them to the summing gain node
var buffer_1 = offlineContext.createBufferSource();
var buffer_2 = offlineContext.createBufferSource();
buffer_1.buffer = audio_buffer_1;
buffer_2.buffer = audio_buffer_2;
buffer_1.connect(summing);
buffer_2.connect(summing);

// Do something with the result by adding a callback
offlineContext.oncomplete = function(event){
  var renderedBuffer = event.renderedBuffer;
  // Place code here
};

// Begin the summing
buffer_1.start(0);
buffer_2.start(0);
offlineContext.startRendering();

Once rendering is done, the callback receives a new buffer, renderedBuffer, which is the direct summation of the two buffers.
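If you want to see what that summation amounts to at the sample level, here is a hand-rolled sketch that adds two channels element-wise, the way the gain node mixes its inputs (plain arrays stand in for the `getChannelData` Float32Arrays of the AudioBuffers; this is an illustration, not the Web Audio code itself):

```javascript
// Mix two sample arrays by adding them element-wise, padding the
// shorter one with silence. Samples lie in [-1, 1], so the sum can
// clip; scale each input by 0.5 first if that is a concern.
function mixSamples(voice, song) {
  var length = Math.max(voice.length, song.length);
  var out = new Float32Array(length);
  for (var i = 0; i < length; i++) {
    out[i] = (voice[i] || 0) + (song[i] || 0);
  }
  return out;
}

var mixed = mixSamples([0.5, -0.25, 0.125], [0.25, 0.25]);
// mixed contains [0.75, 0, 0.125]
```

Both sources start at sample 0, which is exactly the synchronized playback the question asks for; offsetting one input is just a matter of shifting its index (or, in the real API, passing a later time to `start()`).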

Upvotes: 3
