Reputation: 51
I want to build a live streaming app. My thought process:
navigator.mediaDevices.getUserMedia(constraints);
(On the client/streamer side.) Is that correct? How should I do it?
How do I broadcast data to specific room members and not to everyone (I'm using Flask)? How do I consistently send data from the streamer -> server -> room members? The stream I get from getUserMedia is an object, so where is the actual data?
Any other, better ideas would be great! Thanks.
Upvotes: 4
Views: 6587
Reputation: 2085
Implementing a streaming platform is not trivial. Unfortunately, it is not as simple as emitting the chunks received from the MediaRecorder's ondataavailable handler and forwarding them to users through a WebSocket server: that approach is neither scalable, efficient, nor reliable.
Below are some strategies you can try for different types of scenarios:
P2P: If you want simple peer-to-peer streaming, you can use WebRTC with a simple socket.io server for signaling purposes (see the signaling sketch after this list).
Conference: Here things start to get more complicated. You will need a media server if you want to be somewhat scalable. One approach is to route your stream to the users using an SFU or MCU. This will take care of forwarding/processing media to different peers efficiently.
Broadcast: Here things are also non-trivial. Common WebRTC-based architectures include ingesting the WebRTC stream and forwarding it to an HLS server, which makes your stream's chunks available to clients through a CDN, or performing RTP forwarding of the WebRTC stream, converting it to RTMP with something like FFmpeg, and delivering it through YouTube Live or Twitch to leverage their infrastructure.
Be aware that the last 2 items are resource-intensive and will certainly not be cheap to maintain.
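For the P2P case, here is a minimal browser-side sketch of the signaling flow, assuming a socket.io server that simply relays "offer", "answer", and "ice-candidate" events between the two peers. The event names, the signaling URL, and the #remoteVideo element are illustrative choices, not part of any standard.

// One peer calls startCall(); the other answers when the offer arrives.
const socket = io("https://your-signaling-server.example");
const pc = new RTCPeerConnection({
  iceServers: [{ urls: "stun:stun.l.google.com:19302" }],
});

// Send our ICE candidates to the remote peer via the signaling server.
pc.onicecandidate = ({ candidate }) => {
  if (candidate) socket.emit("ice-candidate", candidate);
};

// Render the remote stream when its tracks arrive.
pc.ontrack = ({ streams: [stream] }) => {
  document.querySelector("#remoteVideo").srcObject = stream;
};

async function startCall() {
  // Capture local media and add it to the connection.
  const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });
  stream.getTracks().forEach((track) => pc.addTrack(track, stream));
  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);
  socket.emit("offer", offer);
}

// Callee: apply the offer and reply with an answer.
socket.on("offer", async (offer) => {
  await pc.setRemoteDescription(offer);
  const answer = await pc.createAnswer();
  await pc.setLocalDescription(answer);
  socket.emit("answer", answer);
});

// Caller: apply the answer; both sides apply relayed ICE candidates.
socket.on("answer", (answer) => pc.setRemoteDescription(answer));
socket.on("ice-candidate", (candidate) => pc.addIceCandidate(candidate));

The socket.io server's only job here is to pass SDP and ICE messages back and forth; once the RTCPeerConnection is established, media flows directly between the peers.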
Below are some open source projects that could help you along the way:
Good luck!
Upvotes: 3
Reputation: 108776
Explaining all this is far beyond the scope of a Stack Overflow answer.
Here are a few hints:
You need to use the MediaRecorder API to capture compressed data from your gUM (getUserMedia) stream. MediaRecorder support is inconsistent between makes and models of browser, though.
It kicks a Blob into its ondataavailable handler every so often.
Those Blobs are chunks of a compressed WebM data stream.
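A minimal sketch of that capture step (the MIME type and the one-second timeslice are just illustrative choices, and you may want to check MediaRecorder.isTypeSupported() first, given the spotty support mentioned above):

async function startCapture() {
  // Grab the camera and microphone.
  const stream = await navigator.mediaDevices.getUserMedia({ video: true, audio: true });

  // Ask MediaRecorder for compressed WebM chunks.
  const recorder = new MediaRecorder(stream, { mimeType: "video/webm;codecs=vp8,opus" });

  recorder.ondataavailable = (event) => {
    // event.data is a Blob holding the next piece of the WebM stream.
    if (event.data && event.data.size > 0) {
      console.log("chunk:", event.data.size, "bytes");
    }
  };

  // The timeslice (in ms) controls roughly how often ondataavailable fires.
  recorder.start(1000);
  return recorder;
}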
You can push those Blobs to a server with socket.io, and the server can turn around and push them to whichever clients you want.
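A rough sketch of that relay, reusing the recorder from the capture sketch above and using socket.io rooms so chunks only reach viewers of a particular stream. The event and room names, and the Node-based server, are assumptions on my part; Flask-SocketIO has an equivalent rooms concept (join_room plus room-targeted emits) if you want to stay on Python.

// Streamer side: send each recorded chunk to the server.
recorder.ondataavailable = (event) => {
  if (event.data.size > 0) {
    socket.emit("stream-chunk", { room: "my-stream", chunk: event.data });
  }
};

// Server side (Node + socket.io): relay chunks only to that room.
const { Server } = require("socket.io");
const io = new Server(3000, {
  maxHttpBufferSize: 1e7, // video chunks can exceed the 1 MB default
});

io.on("connection", (client) => {
  // Viewers ask to join the room for the stream they want to watch.
  client.on("join-stream", (room) => client.join(room));

  // Forward the streamer's chunks to everyone else in the room.
  client.on("stream-chunk", ({ room, chunk }) => {
    client.to(room).emit("stream-chunk", chunk);
  });
});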
Playing the WebM on the clients is tricky. You may, on some makes and models of browser, be able to feed the WebM stream to the Media Source Extensions API using appendBuffer(). But some browsers cannot consume these WebM streams.
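A sketch of that playback path on a browser that supports it, assuming the chunks arrive in order over the same socket.io connection as above (the codec string must match what the recorder actually produced, and, per the next point, this only works for a viewer that has received the stream from its very first chunk):

// Viewer side: feed incoming WebM chunks into a <video> element via MSE.
const video = document.querySelector("#liveVideo");
const mediaSource = new MediaSource();
video.src = URL.createObjectURL(mediaSource);

mediaSource.addEventListener("sourceopen", () => {
  const sourceBuffer = mediaSource.addSourceBuffer('video/webm; codecs="vp8,opus"');
  const queue = [];

  // appendBuffer() rejects new data while the buffer is updating,
  // so queue chunks and flush whenever it goes idle.
  const flush = () => {
    if (!sourceBuffer.updating && queue.length) {
      sourceBuffer.appendBuffer(queue.shift());
    }
  };
  sourceBuffer.addEventListener("updateend", flush);

  socket.on("stream-chunk", async (chunk) => {
    // socket.io may deliver binary as an ArrayBuffer or a Blob.
    const data = chunk instanceof Blob ? await chunk.arrayBuffer() : chunk;
    queue.push(data);
    flush();
  });
});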
These WebM streams are useless to a player without all their Blob data, in order. You can't just start sending a new client the Blobs of the stream when they sign in; you have to restart the MediaRecorder.
(You may be able to make it work without a MediaRecorder restart if you send the first few kilobytes of the stream to each new client before sending the current Blob. Extracting those bytes is an intricate programming job involving the ebml package to parse the WebM stream and extract the prologue. I have not proven this concept.)
Because getting all this to work (originator -> server -> viewer) is such a pain in the neck, you may want to investigate using something like mediasoup instead. It uses WebRTC transport rather than socket.io, and works cross-platform.
Upvotes: 0