IceCreamVan

Reputation: 314

FFmpeg live streaming for Media Source Extensions (MSE)

I am trying to implement live video streaming from an RTSP stream to a web page with Media Source Extensions (MSE), using FFmpeg.

Expected system diagram:


I know this can be done with HLS or WebRTC, but HLS has high latency and WebRTC is very hard to implement.

I want to capture the RTSP stream with FFmpeg, split it into ISO BMFF (ISO/IEC 14496-12) chunks in "live mode", and send them over TCP to my web server, which relays the chunks to the web page over a WebSocket. On the web page I append each chunk to the buffer with sourceBuffer.appendBuffer(new Uint8Array(chunk)) and the video plays as a stream.
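A minimal sketch of the browser side I have in mind (the WebSocket URL and codec string are placeholders). One detail that matters: appendBuffer() throws an InvalidStateError if it is called while the SourceBuffer is still updating, so incoming chunks have to be queued:

```javascript
// Queue chunks and append them one at a time: appendBuffer() throws if it
// is called while sourceBuffer.updating is true, which happens as soon as
// two chunks arrive close together.
function createChunkFeeder(sourceBuffer) {
  const queue = [];
  function pump() {
    if (!sourceBuffer.updating && queue.length > 0) {
      sourceBuffer.appendBuffer(queue.shift());
    }
  }
  sourceBuffer.addEventListener('updateend', pump);
  return {
    feed(chunk) {
      queue.push(new Uint8Array(chunk));
      pump();
    },
  };
}

// Hypothetical wiring -- URL and MIME/codec string are placeholders:
function setupPlayback(videoElement, wsUrl, mimeType) {
  const mediaSource = new MediaSource();
  videoElement.src = URL.createObjectURL(mediaSource);
  mediaSource.addEventListener('sourceopen', () => {
    const feeder = createChunkFeeder(mediaSource.addSourceBuffer(mimeType));
    const ws = new WebSocket(wsUrl);
    ws.binaryType = 'arraybuffer';
    ws.onmessage = (e) => feeder.feed(e.data);
  });
}
```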

The problem is in the first step. With FFmpeg I can easily split an input into segments like this:

ffmpeg -i test.mp4 -map 0 -c copy -f segment -segment_time 2 -reset_timestamps 1 output_%03d.mp4

but I can't redirect the output to tcp://127.0.0.1 or pipe:1; if I understand correctly, the segment muxer does not work with pipes. For comparison, I can easily send video frames as JPEGs over TCP with image2pipe, catch the ff d9 end-of-image bytes in the TCP stream, and split the stream into JPEG images:

ffmpeg -i rtsp://127.0.0.1:8554 -f image2pipe tcp://127.0.0.1:7400

How can I split an RTSP stream into ISO BMFF chunks to send to a web page for playback with Media Source Extensions? Or is there another way to prepare an RTSP stream with FFmpeg for playback in MSE? Maybe I have misunderstood how MSE works and how video must be prepared for playback.

Upvotes: 0

Views: 1412

Answers (1)

Brad

Reputation: 163240

...which relays the chunks to the web page over a WebSocket.

You don't need Web Sockets. It's easier than that. In fact, you don't need MediaSource Extensions either.

Your server should stream the data from FFmpeg over a regular HTTP response. Then, you can do something like this in your web page:

<video src="https://stream.example.com/output-from-ffmpeg" preload="none"></video>

How can I split an RTSP stream into ISO BMFF chunks to send to a web page for playback with Media Source Extensions?

You need to implement a thin application server-side to receive the data piped from FFmpeg's STDOUT, and then relay it to the client. I've found it easier to use WebM/Matroska for this, because you won't have to deal with the moov atom and what not.

Upvotes: -1
