Reputation: 65
I'm currently developing an application which will enable visualizing images from different sources (mostly IP cameras) in the browser (in an HTML5 video element). The UI will allow a matrix view, so normally 16 or more cameras will be displayed at the same time. From the cameras I get MJPEG streams or JPEG images (which I "convert" to MJPEG streams). So, for each camera, I have an MJPEG stream which I set as the input for ffmpeg. I instruct ffmpeg to convert this to MP4/H.264 and expose the output as a TCP stream, like this:
ffmpeg -f mjpeg -i "http://localhost/video.mjpg" -f mp4 -vcodec libx264 "tcp://127.0.0.1:5001?listen"
This works just fine on localhost: I get the stream displayed in the web page at full quality.
But this has to work under various network conditions. I played a bit with Chrome's throttling settings and noticed that if the network speed is even slightly below the bitrate required by my current ffmpeg compression settings, things start to go wrong: from the start of the stream being delayed (so it is no longer a live stream), up to a complete freeze of the 'live' image in the browser.
What I need is an "adaptive" way to do the compression, in relation to the current network speed.
My questions are:
Is ffmpeg able to handle this and adapt to network conditions, i.e. automatically reduce the compression quality when the speed is low? The image in the browser would then be lower quality, but live, which is the most important thing in my case.
If not, is there a way to work around this?
Is there a way to detect the network bottleneck, and then restart ffmpeg with lower compression parameters? This is not dynamic adaptive streaming, but it is better than nothing; a rough sketch of what I have in mind follows below.
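For example, something along these lines for the last option (a rough, untested sketch; the bitrate ladder, the 0.9x threshold and the pkill pattern are all made up). ffmpeg's -progress output reports lines like speed=0.87x, and with a blocking TCP output a speed persistently below 1x suggests the network cannot keep up:

#!/bin/bash
# Rough sketch: step the bitrate down whenever ffmpeg falls behind real time.
for rate in 2000k 1000k 500k 250k; do
  ffmpeg -nostats -f mjpeg -i "http://localhost/video.mjpg" \
         -vcodec libx264 -b:v "$rate" -f mp4 \
         -progress pipe:1 "tcp://127.0.0.1:5001?listen" |
  grep --line-buffered '^speed=' |
  while IFS='=x' read -r _ speed _; do
    # below ~1x the encoder, throttled by the blocked TCP socket,
    # is no longer real-time; treat that as a network bottleneck
    if awk -v s="$speed" 'BEGIN { exit !(s > 0 && s < 0.9) }'; then
      pkill -f 'tcp://127.0.0.1:5001'   # crude: end this encode, drop one level
      break
    fi
  done
done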
Thank you in advance!
Upvotes: 2
Views: 2843
Reputation: 494
Your solution will not work outside the local network. Why? Because you have to use HTTP. For that, the best solution is to use HLS or DASH.
HLS
ffmpeg -i input.mp4 -s 640x360 -start_number 0 -hls_time 10 -hls_list_size 0 -f hls index.m3u8
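For adaptive HLS you encode the same input at several quality levels, each into its own media playlist. A sketch with made-up resolutions, bitrates and output paths:

mkdir -p 720p 360p
ffmpeg -i input.mp4 -s 1280x720 -b:v 1400k -start_number 0 -hls_time 10 -hls_list_size 0 -f hls 720p/index.m3u8
ffmpeg -i input.mp4 -s 640x360 -b:v 800k -start_number 0 -hls_time 10 -hls_list_size 0 -f hls 360p/index.m3u8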
To generate adaptive streams you have to create a second-level index. I will not explain it here because it is explained really clearly in the Apple documentation: https://developer.apple.com/library/ios/documentation/NetworkingInternet/Conceptual/StreamingMediaGuide/Introduction/Introduction.html#//apple_ref/doc/uid/TP40008332-CH1-SW1
and in the standard: https://datatracker.ietf.org/doc/html/draft-pantos-http-live-streaming-18
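For illustration only, a minimal second-level (master) index referencing the two variants sketched above might look like this; the player measures its own throughput and switches between the variants accordingly (the BANDWIDTH values must match your actual encodes):

#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=1400000,RESOLUTION=1280x720
720p/index.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=800000,RESOLUTION=640x360
360p/index.m3u8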
DASH
At the moment FFmpeg does not support DASH encoding. You can segment with FFmpeg ( https://www.ffmpeg.org/ffmpeg-formats.html#segment_002c-stream_005fsegment_002c-ssegment ), but I recommend combining FFmpeg and MP4Box: FFmpeg to transcode your live video, and MP4Box to segment it and create the .mpd index.
MP4Box is part of GPAC ( http://gpac.wp.mines-telecom.fr/ ).
An example (using H.264) can be the following; if you need VP8 (WebM), use -vcodec libvpx and -f webm (or -f ts):
ffmpeg -threads 4 -f v4l2 -i /dev/video0 -acodec libfaac -ar 44100 -ab 128k -ac 2 -vcodec libx264 -r 30 -s 1280x720 -f mp4 -y "$movie" && MP4Box -dash 10000 -frag 1000 -rap "$movie"
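Adapted to the MJPEG camera source from the question, the same pipeline might look like this (untested; it records a one-minute chunk and segments it afterwards, so for a true live stream you would segment periodically instead):

ffmpeg -f mjpeg -i "http://localhost/video.mjpg" -t 60 -vcodec libx264 -r 30 -s 1280x720 -f mp4 -y camera.mp4 && MP4Box -dash 10000 -frag 1000 -rap camera.mp4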
Upvotes: 2