Irina Rapoport

Reputation: 1692

How do I stream an audio file to nodejs while it's still being recorded?

I am using the MediaStream Recording API to record audio in the browser, like this (courtesy of https://github.com/bryanjenningz/record-audio):


      const recordAudio = () =>
        new Promise(async resolve => {

      // getUserMedia requires a secure context: it will throw unless the page is served over https:// or from localhost.
          const stream = await navigator.mediaDevices.getUserMedia({ audio: true });

          const mediaRecorder = new MediaRecorder(stream);
          let audioChunks = [];

          mediaRecorder.addEventListener('dataavailable', event => {
            audioChunks.push(event.data);
            console.log("Got audioChunk!!", event.data.size, event.data.type);
      //      mediaRecorder.requestData()
          });

          const start = () => {
            audioChunks = [];
            mediaRecorder.start(1000); // milliseconds per recorded chunk
          };

          const stop = () =>
            new Promise(resolve => {
              mediaRecorder.addEventListener('stop', () => {
                const audioBlob = new Blob(audioChunks, { type: 'audio/mpeg' });
                const audioUrl = URL.createObjectURL(audioBlob);
                const audio = new Audio(audioUrl);
                const play = () => audio.play();
                resolve({ audioChunks, audioBlob, audioUrl, play });
              });

              mediaRecorder.stop();
            });

          resolve({ start, stop });
        });

I would like to modify this code so that it starts streaming to nodejs while it is still recording. I understand the header won't be complete until the recording has finished. I can either account for that on the nodejs side, or perhaps live with an invalid header, because I'll be feeding the data into ffmpeg on nodejs anyway. How do I do this?

Upvotes: 0

Views: 1637

Answers (1)

code_monk

Reputation: 10128

The trick is to start your recorder with mediaRecorder.start(timeSlice), where timeSlice is the number of milliseconds of audio the browser records into each chunk before emitting a dataavailable event with a blob of data.
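
For example (assuming mediaRecorder is the MediaRecorder instance from the question's code), a one-second slice looks like this:

    // Fire a dataavailable event with a Blob of recorded data every 1000 ms.
    mediaRecorder.start(1000);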

Then, in your dataavailable event handler, you send the data to the server:

    mediaRecorder.addEventListener('dataavailable', event => {

        myHTTPLibrary.post(event.data);

    });

That's the general solution. It's not possible to embed a working example here, because a code sandbox can't ask for access to your webcam or microphone, but I've created one here. It simply sends your data to Request Bin, where you can watch the data stream in.
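
If you'd rather not depend on a particular HTTP library, here is a minimal client-side sketch using fetch; the /upload route is an assumption, not something from the original sandbox:

    // Minimal sketch: stream each recorded chunk to the server as it arrives.
    // The /upload route is an assumption; point it at whatever your nodejs server exposes.
    (async () => {
      const stream = await navigator.mediaDevices.getUserMedia({ audio: true });
      const mediaRecorder = new MediaRecorder(stream);

      mediaRecorder.addEventListener('dataavailable', event => {
        // event.data is a Blob; fetch accepts it directly as the request body.
        fetch('/upload', {
          method: 'POST',
          headers: { 'Content-Type': event.data.type || 'application/octet-stream' },
          body: event.data
        }).catch(err => console.error('chunk upload failed', err));
      });

      mediaRecorder.start(1000); // emit a chunk roughly every second
    })();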

There are some other things you'll need to think about if you want to stitch the video or audio back together. The blog post touches on that.
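
On the nodejs side, one minimal sketch (assuming Express and a locally installed ffmpeg, neither of which the answer prescribes) is to pipe the incoming chunks straight into an ffmpeg child process:

    // Minimal sketch: forward each uploaded chunk to an ffmpeg child process.
    // Express, the /upload route, and out.mp3 are assumptions for illustration.
    const express = require('express');
    const { spawn } = require('child_process');

    const app = express();

    // ffmpeg reads the concatenated chunks from stdin ('pipe:0') and writes out.mp3.
    const ffmpeg = spawn('ffmpeg', ['-y', '-i', 'pipe:0', 'out.mp3']);

    app.post('/upload', (req, res) => {
      // Forward the raw request body (one recorded chunk) to ffmpeg's stdin.
      req.on('data', chunk => ffmpeg.stdin.write(chunk));
      req.on('end', () => res.sendStatus(200));
    });

    app.listen(3000, () => console.log('listening on :3000'));

Note that separate POST requests aren't guaranteed to arrive in order, so for anything beyond a demo a single long-lived request or a WebSocket is a safer way to keep the chunks sequential; you'd also want to end ffmpeg's stdin when the recording stops.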

Upvotes: 1
