Reputation: 163
I'm trying to screen-blend two libvpx-vp9 webm files so that the blend comes out looking correct in FFmpeg. The example below takes two RGBA PNG input files, loops each one for a couple of seconds into a libvpx-vp9 webm file with the pixel format yuva420p, and then tries to blend them with FFmpeg. I then extract frames from the results to visualise how they look here in this Stack Overflow post.
I have these two input RGBA PNGs (circle and Pikachu):
I create two libvpx-vp9 webm files from them like this:-
ffmpeg -loop 1 -i circle_50_rgba.png -c:v libvpx-vp9 -t 2 -pix_fmt yuva420p circle_libvpx-vp9_yuva420p.webm
ffmpeg -loop 1 -i pikachu_rgba.png -c:v libvpx-vp9 -t 2 -pix_fmt yuva420p pikachu_libvpx-vp9_yuva420p.webm
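As an aside, the reason -c:v libvpx-vp9 appears before each -i in the commands below is that, as far as I can tell, FFmpeg's native vp9 decoder ignores the alpha channel, so the libvpx-vp9 decoder has to be forced for the files to decode as yuva420p at all. A quick way to check that an encoded file really does decode with alpha is to decode it to a null output and look at the reported pixel format:-
ffmpeg -hide_banner -c:v libvpx-vp9 -i circle_libvpx-vp9_yuva420p.webm -f null -
which should report the stream as yuva420p, as it does in the log further down.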
I then try to blend these two libvpx-vp9 webm files like this:-
ffmpeg -y -c:v libvpx-vp9 -i circle_libvpx-vp9_yuva420p.webm -c:v libvpx-vp9 -i pikachu_libvpx-vp9_yuva420p.webm -filter_complex "[1:v][0:v]blend=all_mode=screen" pikachu_reverse_all_mode_screened_onto_circle_both_yuva420p.webm
and extract a frame from that like this:-
ffmpeg -c:v libvpx-vp9 -i pikachu_reverse_all_mode_screened_onto_circle_both_yuva420p.webm -frames:v 1 pikachu_reverse_all_mode_screened_onto_circle_from_yuva420p.png
This comes out looking incorrect. If I do the blend without all_mode, like this:-
ffmpeg -y -c:v libvpx-vp9 -i circle_libvpx-vp9_yuva420p.webm -c:v libvpx-vp9 -i pikachu_libvpx-vp9_yuva420p.webm -filter_complex "[1:v][0:v]blend=screen" pikachu_reverse_screened_onto_circle_both_yuva420p.webm
and then extract the png so we can visualise it, like this:-
ffmpeg -c:v libvpx-vp9 -i pikachu_reverse_screened_onto_circle_both_yuva420p.webm -frames:v 1 pikachu_reverse_screened_onto_circle_from_yuva420p.png
The result is also incorrect: in a screen blend the white part of the circle should come out completely white, yet we can see a faint yellow outline of Pikachu inside the white part.
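My understanding (which may be wrong) is that two things are going on here. First, the blend filter operates on each plane of whatever pixel format it is given, and screen is defined per channel as 1 - (1 - a)(1 - b) on normalised values; that formula only makes sense on R, G and B, not on U/V chroma planes centred on 128, which is where the colour cast comes from. Second, blend=screen appears to be shorthand for setting only the first per-plane option, c0_mode, so only plane 0 gets screened at all; written out explicitly it is roughly:-
ffmpeg -y -c:v libvpx-vp9 -i circle_libvpx-vp9_yuva420p.webm -c:v libvpx-vp9 -i pikachu_libvpx-vp9_yuva420p.webm -filter_complex "[1:v][0:v]blend=c0_mode=screen" pikachu_reverse_screened_onto_circle_both_yuva420p.webm
whereas all_mode=screen applies the mode to every plane.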
Here is the full log of the blend=screen command:-
ffmpeg -y -c:v libvpx-vp9 -i circle_libvpx-vp9_yuva420p.webm -c:v libvpx-vp9 -i pikachu_libvpx-vp9_yuva420p.webm -filter_complex "[1:v][0:v]blend=screen" pikachu_reverse_screened_onto_circle_both_yuva420p.webm
ffmpeg version 4.2.7-0ubuntu0.1 Copyright (c) 2000-2022 the FFmpeg developers
built with gcc 9 (Ubuntu 9.4.0-1ubuntu1~20.04.1)
configuration: --prefix=/usr --extra-version=0ubuntu0.1 --toolchain=hardened --libdir=/usr/lib/x86_64-linux-gnu --incdir=/usr/include/x86_64-linux-gnu --arch=amd64 --enable-gpl --disable-stripping --enable-avresample --disable-filter=resample --enable-avisynth --enable-gnutls --enable-ladspa --enable-libaom --enable-libass --enable-libbluray --enable-libbs2b --enable-libcaca --enable-libcdio --enable-libcodec2 --enable-libflite --enable-libfontconfig --enable-libfreetype --enable-libfribidi --enable-libgme --enable-libgsm --enable-libjack --enable-libmp3lame --enable-libmysofa --enable-libopenjpeg --enable-libopenmpt --enable-libopus --enable-libpulse --enable-librsvg --enable-librubberband --enable-libshine --enable-libsnappy --enable-libsoxr --enable-libspeex --enable-libssh --enable-libtheora --enable-libtwolame --enable-libvidstab --enable-libvorbis --enable-libvpx --enable-libwavpack --enable-libwebp --enable-libx265 --enable-libxml2 --enable-libxvid --enable-libzmq --enable-libzvbi --enable-lv2 --enable-omx --enable-openal --enable-opencl --enable-opengl --enable-sdl2 --enable-libdc1394 --enable-libdrm --enable-libiec61883 --enable-nvenc --enable-chromaprint --enable-frei0r --enable-libx264 --enable-shared
libavutil 56. 31.100 / 56. 31.100
libavcodec 58. 54.100 / 58. 54.100
libavformat 58. 29.100 / 58. 29.100
libavdevice 58. 8.100 / 58. 8.100
libavfilter 7. 57.100 / 7. 57.100
libavresample 4. 0. 0 / 4. 0. 0
libswscale 5. 5.100 / 5. 5.100
libswresample 3. 5.100 / 3. 5.100
libpostproc 55. 5.100 / 55. 5.100
[libvpx-vp9 @ 0x55d5b1f34680] v1.8.2
Last message repeated 1 times
Input #0, matroska,webm, from 'circle_libvpx-vp9_yuva420p.webm':
Metadata:
ENCODER : Lavf58.29.100
Duration: 00:00:02.00, start: 0.000000, bitrate: 19 kb/s
Stream #0:0: Video: vp9 (Profile 0), yuva420p(tv), 50x50, SAR 1:1 DAR 1:1, 25 fps, 25 tbr, 1k tbn, 1k tbc (default)
Metadata:
alpha_mode : 1
ENCODER : Lavc58.54.100 libvpx-vp9
DURATION : 00:00:02.000000000
[libvpx-vp9 @ 0x55d5b1f854c0] v1.8.2
Last message repeated 1 times
Input #1, matroska,webm, from 'pikachu_libvpx-vp9_yuva420p.webm':
Metadata:
ENCODER : Lavf58.29.100
Duration: 00:00:02.00, start: 0.000000, bitrate: 29 kb/s
Stream #1:0: Video: vp9 (Profile 0), yuva420p(tv), 50x50, SAR 1:1 DAR 1:1, 25 fps, 25 tbr, 1k tbn, 1k tbc (default)
Metadata:
alpha_mode : 1
ENCODER : Lavc58.54.100 libvpx-vp9
DURATION : 00:00:02.000000000
[libvpx-vp9 @ 0x55d5b1f38940] v1.8.2
[libvpx-vp9 @ 0x55d5b1f49440] v1.8.2
Stream mapping:
Stream #0:0 (libvpx-vp9) -> blend:bottom
Stream #1:0 (libvpx-vp9) -> blend:top
blend -> Stream #0:0 (libvpx-vp9)
Press [q] to stop, [?] for help
[libvpx-vp9 @ 0x55d5b1f49440] v1.8.2
[libvpx-vp9 @ 0x55d5b1f38940] v1.8.2
[libvpx-vp9 @ 0x55d5b1f80c40] v1.8.2
Output #0, webm, to 'pikachu_reverse_screened_onto_circle_both_yuva420p.webm':
Metadata:
encoder : Lavf58.29.100
Stream #0:0: Video: vp9 (libvpx-vp9), yuva420p, 50x50 [SAR 1:1 DAR 1:1], q=-1--1, 200 kb/s, 25 fps, 1k tbn, 25 tbc (default)
Metadata:
encoder : Lavc58.54.100 libvpx-vp9
Side data:
cpb: bitrate max/min/avg: 0/0/0 buffer size: 0 vbv_delay: -1
frame= 50 fps=0.0 q=0.0 Lsize= 7kB time=00:00:01.96 bitrate= 29.3kbits/s speed=33.2x
video:4kB audio:0kB subtitle:0kB other streams:0kB global headers:0kB muxing overhead: 96.711426%
I also tried doing a conversion to rgba first, like this:-
ffmpeg -y -c:v libvpx-vp9 -i circle_libvpx-vp9_yuva420p.webm -c:v libvpx-vp9 -i pikachu_libvpx-vp9_yuva420p.webm -filter_complex "[0:v]format=pix_fmts=rgba[zero];[1:v]format=pix_fmts=rgba[one];[one][zero]blend=screen" pikachu_reverse_screened_all_mode_onto_circle_after_rgba_conversion_webm.webm
However, the result of this also comes out with yellow inside the white circle, which should be completely white.
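I suspect, though I haven't confirmed it, that the rgba conversion doesn't survive all the way into the blend: blend appears to accept only planar pixel formats, so FFmpeg would silently insert another format conversion after format=pix_fmts=rgba (and blend=screen still only sets c0_mode, as noted above). Running the command at a higher log level shows any auto-inserted scale/format conversions in the filter graph, e.g.:-
ffmpeg -y -v verbose -c:v libvpx-vp9 -i circle_libvpx-vp9_yuva420p.webm -c:v libvpx-vp9 -i pikachu_libvpx-vp9_yuva420p.webm -filter_complex "[0:v]format=pix_fmts=rgba[zero];[1:v]format=pix_fmts=rgba[one];[one][zero]blend=screen" -f null -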
I was wondering what I need to do so that the blend of these two libvpx-vp9 webm files comes out looking correct, like it does above.
Note: I need to retain the alpha channels, because some assets have transparency; the assets in the examples above just happen to be fully opaque.
Upvotes: 0
Views: 318
Reputation: 163
The secret is to convert the inputs to gbrp (planar RGB) and also use all_mode, so that the screen blend is applied to all of the R, G and B planes instead of to Y, U and V, like this:-
ffmpeg -y -c:v libvpx-vp9 -i circle_libvpx-vp9_yuva420p.webm -c:v libvpx-vp9 -i pikachu_libvpx-vp9_yuva420p.webm -filter_complex "[0:v]format=pix_fmts=gbrp[zero];[1:v]format=pix_fmts=gbrp[one];[one][zero]blend=all_mode=screen" pikachu_reverse_screened_all_mode_onto_circle_after_gbrp_conversion_webm.webm
If you then extract a frame from that like this:-
ffmpeg -c:v libvpx-vp9 -i pikachu_reverse_screened_all_mode_onto_circle_after_gbrp_conversion_webm.webm -frames:v 1 pikachu_reverse_screened_all_mode_onto_circle_after_gbrp_conversion_png.png
you'll get it looking like this:-
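One caveat: gbrp is planar RGB without an alpha plane, so this particular command drops the alpha channel. If the alpha really does need to be carried through (as the question notes), a variant using gbrap (planar RGB with alpha) might be worth trying instead; the following is an untested sketch along the same lines, forcing yuva420p again on the output:-
# untested sketch: gbrap keeps an alpha plane, and all_mode=screen will screen the alpha planes too
ffmpeg -y -c:v libvpx-vp9 -i circle_libvpx-vp9_yuva420p.webm -c:v libvpx-vp9 -i pikachu_libvpx-vp9_yuva420p.webm -filter_complex "[0:v]format=pix_fmts=gbrap[zero];[1:v]format=pix_fmts=gbrap[one];[one][zero]blend=all_mode=screen" -c:v libvpx-vp9 -pix_fmt yuva420p pikachu_reverse_screened_all_mode_onto_circle_after_gbrap_conversion.webm
Screening the alpha makes no difference for fully opaque inputs; if it matters, the colour planes can be screened individually with c0_mode, c1_mode and c2_mode instead of all_mode.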
Upvotes: 0