codebot

Reputation: 2646

Video colour difference when publishing via ffmpeg

I'm trying to publish a video with ffmpeg, using frame images generated in Python as the input source. But when it streams, the video colours look different from the originals.

from subprocess import Popen, PIPE

p = Popen(['ffmpeg', '-y', '-f', 'image2pipe', '-r', '30', '-i', '-',
           '-vcodec', 'mpeg4', '-qscale', '5', '-r', '30', '-b:a', '32k', '-ar', '44100',
           '-pix_fmt', 'rgb24', '-color_range', '3', '-c:v', 'libx264rgb',
           '-f', 'flv', 'rtmp://stream_ip/live/bbb'],
          stdin=PIPE)
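The code that writes the frames to ffmpeg's stdin isn't shown in the question; a common pattern that produces exactly this kind of colour shift (this is only a guess at what the feeding code looks like) is handing the OpenCV BGR array straight to PIL and piping it as a JPEG:

import cv2
from PIL import Image

cap = cv2.VideoCapture('input.mp4')     # illustrative frame source
ok, frame = cap.read()                  # OpenCV returns the frame in BGR order
img = Image.fromarray(frame)            # PIL assumes RGB, so red and blue end up swapped
img.save(p.stdin, 'JPEG')               # ffmpeg then sees mjpeg input on stdin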

[Two screenshots: the original frame and the streamed frame with shifted colours]

How can I adjust the ffmpeg command so that the stream shows the images with their original colours?

Console

ffmpeg version 4.1.8-0+deb10u1 Copyright (c) 2000-2021 the FFmpeg developers
built with gcc 8 (Debian 8.3.0-6)
configuration: --prefix=/usr --extra-version=0+deb10u1 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --arch=amd64 --enable-gpl --disable-stripping --enable-avresample --disable-filter=resample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librsvg --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-chromaprint --enable-frei0r --enable-libx264 --enable-shared
libavutil      56. 22.100 / 56. 22.100
libavcodec     58. 35.100 / 58. 35.100
libavformat    58. 20.100 / 58. 20.100
libavdevice    58.  5.100 / 58.  5.100
libavfilter     7. 40.101 /  7. 40.101
libavresample   4.  0.  0 /  4.  0.  0
libswscale      5.  3.100 /  5.  3.100
libswresample   3.  3.100 /  3.  3.100
libpostproc    55.  3.100 / 55.  3.100
Input #0, image2pipe, from 'pipe:':
Duration: N/A, bitrate: N/A
Stream #0:0: Video: mjpeg, yuvj420p(pc, bt470bg/unknown/unknown), 1152x720 [SAR 1:1 DAR 8:5], 30 fps, 30 tbr, 30 tbn, 30 tbc
Codec AVOption b (set bitrate (in bits/s)) specified for output file #0 (rtmp://ip/live/bbb) has not been used for any stream. The most likely reason is either wrong type (e.g. a video option with no video streams) or that it is a private option of some encoder which was not actually used for any stream.
Stream mapping:
 Stream #0:0 -> #0:0 (mjpeg (native) -> flv1 (flv))
[swscaler @ 0x5639066a7f80] deprecated pixel format used, make sure you did set range correctly
Output #0, flv, to 'rtmp://ip/live/bbb':
  Metadata:
    encoder         : Lavf58.20.100
    Stream #0:0: Video: flv1 (flv) ([2][0][0][0] / 0x0002), yuv420p, 1152x720 [SAR 1:1 DAR 8:5], q=2-31, 200 kb/s, 30 fps, 1k tbn, 30 tbc
    Metadata:
       encoder         : Lavc58.35.100 flv
    Side data:
      cpb: bitrate max/min/avg: 0/0/200000 buffer size: 0 vbv_delay: -1
frame= 2917 fps= 29 q=31.0 size=    8834kB time=00:01:37.20 bitrate= 744.5kbits/s speed=0.952x  

Upvotes: 1

Views: 457

Answers (1)

Mark Setchell

Reputation: 207630

If you are reading JPEGs, PNGs, or video into OpenCV, it will hold them in memory in BGR channel ordering. If you are feeding such frames into ffmpeg, you must either:

  • convert the frames to RGB first inside OpenCV with cv2.cvtColor(..., cv2.COLOR_BGR2RGB) before sending them to ffmpeg, or
  • tell ffmpeg that the frames are in BGR order by putting -pix_fmt bgr24 before the input specifier, i.e. before -i - (see the sketch after this list)
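For example, the second option could look roughly like this when piping raw frames. This is a minimal sketch, not the asker's pipeline: it switches from image2pipe to rawvideo, so the frame size and rate must be declared up front, and the frame source, dimensions, and output settings are illustrative.

from subprocess import Popen, PIPE
import cv2

width, height, fps = 1152, 720, 30               # must match the frames you pipe

p = Popen(['ffmpeg', '-y',
           '-f', 'rawvideo', '-pix_fmt', 'bgr24',    # input options: raw BGR frames
           '-s', f'{width}x{height}', '-r', str(fps),
           '-i', '-',                                # read frames from stdin
           '-c:v', 'libx264', '-pix_fmt', 'yuv420p', # encode to a widely supported format
           '-f', 'flv', 'rtmp://stream_ip/live/bbb'],
          stdin=PIPE)

cap = cv2.VideoCapture(0)                        # any BGR frame source works here
while True:
    ok, frame = cap.read()                       # frame is BGR, exactly as OpenCV stores it
    if not ok:
        break
    p.stdin.write(frame.tobytes())               # no colour conversion needed with bgr24 input

p.stdin.close()
p.wait()

Declaring -pix_fmt bgr24 as an input option lets ffmpeg interpret the channel order correctly without any conversion in Python; the output is still converted to yuv420p for broad player support.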

Upvotes: 2
