Reputation: 55
I am trying to implement a system that takes a live camera stream, overlays some text and symbols on it (using alpha-channel transparency), and transmits the result over RTP/UDP as a single video stream (from one port). Both the host and the client systems are based on the i.MX6QP. For now, I am trying to figure out GStreamer and its pipeline system with gst-launch, using only test patterns as video sources.
Here is a simplified diagram to show what I'm trying to achieve:
I have looked at the videomixer plugin, but from what I understand, it is only meant for overlaying and playing videos, not for producing a 'transmittable' video stream. (I can get it working locally with xvimagesink, roughly as in the sketch below, but I couldn't manage to build the pipeline with udpsink, and I couldn't find a workaround for it.)
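For reference, a minimal local-preview sketch of the kind of videomixer pipeline I mean (test patterns only; the sizes, positions and alpha value are just example numbers):
gst-launch-1.0 \
videomixer name=mix sink_1::xpos=20 sink_1::ypos=20 sink_1::alpha=0.5 ! \
videoconvert ! xvimagesink \
videotestsrc pattern=0 ! "video/x-raw, width=640, height=480, framerate=30/1" ! mix.sink_0 \
videotestsrc pattern=18 ! "video/x-raw, width=160, height=120, framerate=30/1" ! mix.sink_1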
I haven't been able to find the right tools/methods to implement the system described above. Am I right about the videomixer plugin? If so, what do you suggest I do? Any help is appreciated, thanks in advance.
Upvotes: 0
Views: 342
Reputation: 595
If you're processing the stream on an i.MX6, you should take a look at the imxg2dcompositor plugin from gstreamer-imx:
gst-inspect-1.0 imxg2dcompositor
From their example, can you try something like:
gst-launch-1.0 \
imxg2dcompositor name=c background-color=0x223344 \
sink_0::xpos=0 sink_0::ypos=90 sink_0::width=160 sink_0::height=110 sink_0::zorder=55 sink_0::fill_color=0xff00ff00 sink_0::alpha=0.39 sink_0::rotation=0 \
sink_1::xpos=0 sink_1::ypos=20 sink_1::width=620 sink_1::height=380 sink_1::fill_color=0x44441133 ! \
queue2 ! "video/x-raw, width=800, height=600" ! imxipuvideotransform ! imxvpuenc_h264 ! rtph264pay ! udpsink \
videotestsrc pattern=0 ! "video/x-raw, framerate=30/1" ! c.sink_0 \
videotestsrc pattern=18 ! "video/x-raw, framerate=30/1" ! c.sink_1
I'd advise you not to skip the video encoding part, which is essential when setting up video streaming applications (the H.264 encoding plugin here is imxvpuenc_h264). The encoding step is then followed by the matching RTP payloader (rtph264pay).
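On the sending side you will also want to point udpsink at the receiving board (e.g. udpsink host=<receiver IP> port=5000; the port number is just an example). On the receiving i.MX6QP, a pipeline along these lines should be able to display the stream; the port, caps and the gstreamer-imx decoder/sink elements are assumptions on my part, so adjust them to your setup:
gst-launch-1.0 \
udpsrc port=5000 caps="application/x-rtp, media=video, clock-rate=90000, encoding-name=H264, payload=96" ! \
rtpjitterbuffer ! rtph264depay ! h264parse ! imxvpudec ! imxipuvideosink
If the receiver never gets the SPS/PPS headers, setting config-interval=1 on rtph264pay on the sender usually helps.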
Upvotes: 1