Reputation: 533
I'm trying to understand the feasibility of a live streaming solution. I want to grab WebRTC streams (audio and video), send them to a server, and transform them into chunks to serve to an HTML5 video tag or a DASH player, using the WebM container (VP8 and Opus codecs).
I also looked into ffmpeg, ffserver and gstreamer, but...
My question is: how do I feed the live WebRTC streams to the server and transform them into HTTP chunks (live, DASH-compatible)?
Has anyone achieved something like this?
Upvotes: 9
Views: 1217
Reputation: 494
This can be achieved in Node.js with the following steps; rough sketches for each step follow the list:

1. Create a WebRTC connection between the browser and the server. This is essentially a signaling handshake that creates RTCPeerConnection references on both the server and the client.
2. Pipe the getUserMedia MediaStream into the WebRTC connection. This needs to be done track by track using addTrack, since RTCPeerConnection.addStream is deprecated (client-side sketch below).
3. Listeners on the server push the incoming media into a stream interface, which is piped to fluent-ffmpeg (server-side sketch below).
4. fluent-ffmpeg transcodes the raw audio stream into MPEG-DASH files; see the ffmpeg options for DASH transcoding (sketch below).
5. The output files (chunks and manifest) sit in a single folder that is made available to the client.
6. The client uses Shaka Player or some other DASH player and points it at the manifest (the segments are fetched relative to it; playback sketch below).
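A minimal browser-side sketch of steps 1 and 2. The /signal endpoint and its JSON payload are assumptions for illustration; any signaling channel (WebSocket, REST, socket.io) works, and ICE candidate exchange is omitted for brevity:

```js
// Browser: capture audio/video and publish it to the server over WebRTC.
async function publish() {
  const pc = new RTCPeerConnection();

  const stream = await navigator.mediaDevices.getUserMedia({ audio: true, video: true });
  // Add each track individually: RTCPeerConnection.addStream is deprecated.
  for (const track of stream.getTracks()) {
    pc.addTrack(track, stream);
  }

  const offer = await pc.createOffer();
  await pc.setLocalDescription(offer);

  // Send the offer to the server and apply its answer
  // (the /signal endpoint is a hypothetical stand-in for real signaling).
  const res = await fetch('/signal', {
    method: 'POST',
    headers: { 'Content-Type': 'application/json' },
    body: JSON.stringify(pc.localDescription)
  });
  await pc.setRemoteDescription(await res.json());
}

publish().catch(console.error);
```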
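On the server (step 3), the wrtc package's nonstandard RTCAudioSink can turn an incoming audio track into raw PCM that a Node stream can carry; the node-webrtc-examples repo linked below shows the same pattern, and video works analogously with RTCVideoSink:

```js
// Server (Node.js): convert an incoming WebRTC audio track into a Node stream.
const { PassThrough } = require('stream');
const { nonstandard } = require('wrtc');

function audioTrackToStream(track) {
  const sink = new nonstandard.RTCAudioSink(track);
  const pcm = new PassThrough();

  // Each callback delivers a short buffer of signed 16-bit PCM samples.
  sink.ondata = ({ samples }) => {
    pcm.write(Buffer.from(samples.buffer));
  };

  // Call stop() when the peer disconnects to release the sink.
  const stop = () => {
    sink.stop();
    pcm.end();
  };

  return { pcm, stop };
}
```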
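Step 4 sketched with fluent-ffmpeg. The dash muxer flags below (seg_duration, window_size, streaming) are one workable combination for live output rather than the definitive one, and the 48 kHz mono input settings must match what RTCAudioSink actually reports:

```js
const ffmpeg = require('fluent-ffmpeg');

// Transcode a raw PCM stream to Opus inside a live MPEG-DASH output.
function startDashTranscode(pcmStream, outDir) {
  return ffmpeg(pcmStream)
    .inputFormat('s16le')                  // raw signed 16-bit little-endian PCM
    .inputOptions(['-ar 48000', '-ac 1'])  // must match the sink's sample rate/channels
    .audioCodec('libopus')
    .format('dash')
    .outputOptions([
      '-seg_duration 2',  // roughly 2-second segments
      '-window_size 5',   // keep a sliding window of segments on disk
      '-streaming 1'
    ])
    .on('error', (err) => console.error('ffmpeg failed:', err.message))
    .save(`${outDir}/manifest.mpd`);       // segments and manifest land in outDir
}
```

Serve outDir statically (e.g. with express.static) so the player can fetch the manifest and segments over HTTP, which covers step 5.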
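And step 6 on the playback side, assuming Shaka Player is loaded on the page, a video element with id "video" exists, and the folder above is served at /dash:

```js
// Client: attach Shaka Player to the video element and load the live manifest.
shaka.polyfill.installAll();
const player = new shaka.Player(document.getElementById('video'));
player.load('/dash/manifest.mpd').catch((err) => console.error('Shaka error:', err));
```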
Useful examples of how WebRTC can be used in Node.js (including transcoding to MP4): https://github.com/node-webrtc/node-webrtc-examples
Upvotes: 0