Valdir

Reputation: 515

Web Audio API: Proper way to play data chunks from a Node.js server via socket

I'm using the following code to decode audio chunks coming from a Node.js socket:

window.AudioContext = window.AudioContext || window.webkitAudioContext;
var context = new AudioContext();
var delayTime = 0;
var init = 0;
var audioStack = [];
var nextTime = 0;

client.on('stream', function(stream, meta){
    stream.on('data', function(data) {
        context.decodeAudioData(data, function(buffer) {
            audioStack.push(buffer);
            if ((init != 0) || (audioStack.length > 10)) { // make sure we buffer more than 10 chunks before starting
                init++;
                scheduleBuffers();
            }
        }, function(err) {
            console.log("err(decodeAudioData): "+err);
        });
    });
});

function scheduleBuffers() {
    while (audioStack.length) {
        var buffer = audioStack.shift();
        var source = context.createBufferSource();
        source.buffer = buffer;
        source.connect(context.destination);
        if (nextTime == 0)
            nextTime = context.currentTime + 0.05;  // add 50ms of latency to work well across systems - tune this if you like
        source.start(nextTime);
        nextTime += source.buffer.duration; // make the next buffer wait the length of the last buffer before being played
    }
}

But there are gaps/glitches between audio chunks that I haven't been able to get rid of.

I've also read that with MediaSource it's possible to do the same and have the timing handled by the player instead of doing it manually. Can someone provide an example of handling mp3 data?

Moreover, what is the proper way to handle live streaming with the Web Audio API? I've already read almost all the questions on SO about this subject and none of them seem to work without glitches. Any ideas?

Upvotes: 8

Views: 3218

Answers (2)

totteire

Reputation: 97

Yes, @Keyne is right:

const mediaSource = new MediaSource()
player.src = URL.createObjectURL(mediaSource)

// addSourceBuffer() is only valid once the MediaSource has opened
mediaSource.addEventListener('sourceopen', () => {
  const sourceBuffer = mediaSource.addSourceBuffer('audio/mpeg')
  sourceBuffer.appendBuffer(chunk) // Repeat this for each chunk as ArrayBuffer
  player.play()
})
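The snippet above appends a single chunk. In a live stream, chunks keep arriving while the SourceBuffer may still be busy with a previous append, so they have to be queued and flushed on 'updateend'. A minimal sketch of that pattern, assuming the same client/stream socket events as in the question and an <audio> element referred to as player (those names are assumptions, not part of this answer):

const player = document.querySelector('audio'); // assumed <audio> element
const mediaSource = new MediaSource();
player.src = URL.createObjectURL(mediaSource);

const queue = [];
let sourceBuffer = null;

mediaSource.addEventListener('sourceopen', () => {
    sourceBuffer = mediaSource.addSourceBuffer('audio/mpeg');
    // Each time an append finishes, feed the next queued chunk.
    sourceBuffer.addEventListener('updateend', appendNext);
    appendNext();
});

function appendNext() {
    if (sourceBuffer && !sourceBuffer.updating && queue.length) {
        sourceBuffer.appendBuffer(queue.shift()); // each chunk must be an ArrayBuffer / typed array
    }
}

// Assuming the same socket API as in the question.
client.on('stream', function(stream, meta) {
    stream.on('data', function(data) {
        queue.push(data);
        appendNext();
        player.play();
    });
});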

But do this only if you don't care about iOS support 🤔 (https://developer.mozilla.org/en-US/docs/Web/API/MediaSource#Browser_compatibility)

Otherwise, please let me know how you do it!

Upvotes: 0

Keyne Viana

Reputation: 6202

You can take this code as an example: https://github.com/kmoskwiak/node-tcp-streaming-server

It basically uses Media Source Extensions (MSE). All you need to do is change from video to audio:

buffer = mediaSource.addSourceBuffer('audio/mpeg');
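For the audio case, the Content-Type the server sends should match as well. A minimal Node.js sketch of the serving half (plain HTTP, a placeholder file path, and port 3000 are assumptions for illustration, not code taken from the linked repo):

// Hypothetical sketch — not the linked repo's code verbatim.
const http = require('http');
const fs = require('fs');

http.createServer((req, res) => {
    // Same idea as the video version, but the MIME type is audio/mpeg.
    res.writeHead(200, { 'Content-Type': 'audio/mpeg' });
    fs.createReadStream('./audio.mp3').pipe(res); // placeholder file path
}).listen(3000);

On the client, the chunks arriving from this response can be appended to the audio/mpeg SourceBuffer exactly as in the other answer.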

Upvotes: 4
