Reputation: 17325
I'm having an issue while implementing a GStreamer plugin to play RTP video on Android. I have the following code (which works correctly):
full_pipeline_description = g_strdup_printf("playbin3 uri=%s", uri);
gub_log_pipeline(pipeline, "Using pipeline: %s", full_pipeline_description);
pipeline->pipeline = gst_parse_launch(full_pipeline_description, &err);
g_free(full_pipeline_description);
if (err) {
    gub_log_pipeline(pipeline, "Failed to create pipeline: %s", err->message);
    return;
}

vsink = gst_parse_bin_from_description(gub_get_video_branch_description(), TRUE, NULL);
gub_log_pipeline(pipeline, "Using video sink: %s", gub_get_video_branch_description());
g_object_set(pipeline->pipeline, "video-sink", vsink, NULL);
g_object_set(pipeline->pipeline, "flags", 0x0003, NULL); /* GST_PLAY_FLAG_VIDEO | GST_PLAY_FLAG_AUDIO */

bus = gst_pipeline_get_bus(GST_PIPELINE(pipeline->pipeline));
gst_bus_add_signal_watch(bus);
g_signal_connect(bus, "message", G_CALLBACK(message_received), pipeline);
gst_object_unref(bus);

if (vsink) {
    // Plant a pad probe to answer context queries
    GstElement *sink = gst_bin_get_by_name(GST_BIN(vsink), "sink");
    if (sink) {
        GstPad *pad = gst_element_get_static_pad(sink, "sink");
        if (pad) {
            gst_pad_add_probe(pad, GST_PAD_PROBE_TYPE_QUERY_DOWNSTREAM, pad_probe, pipeline, NULL);
            gst_object_unref(pad);
        }
        gst_object_unref(sink);
    }
}
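For reference, pad_probe follows the usual pattern of answering GST_QUERY_CONTEXT queries from a downstream query probe; a simplified sketch of it (details trimmed, gub_provide_graphic_context is our own helper):

static GstPadProbeReturn pad_probe(GstPad *pad, GstPadProbeInfo *info, gpointer user_data) {
    GUBPipeline *pipeline = (GUBPipeline *)user_data;
    GstQuery *query = GST_PAD_PROBE_INFO_QUERY(info);
    const gchar *context_type;
    GstContext *context;

    if (GST_QUERY_TYPE(query) != GST_QUERY_CONTEXT)
        return GST_PAD_PROBE_OK;

    /* Provide the graphics (GL) context the sink is asking for */
    gst_query_parse_context_type(query, &context_type);
    context = gub_provide_graphic_context(pipeline->graphic_context, context_type);
    if (!context)
        return GST_PAD_PROBE_OK;

    gst_query_set_context(query, context);
    gst_context_unref(context);
    return GST_PAD_PROBE_HANDLED;
}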
However, using the same code with another pipeline (based on udpsrc instead of playbin3) does not work. The pipeline I use in this case is:
udpsrc port=53512 ! application/x-rtp,media=video,clock-rate=90000,encoding-name=H264,payload=96 ! rtph264depay ! decodebin3 ! glupload ! glcolorconvert ! video/x-raw(memory:GLMemory),format=RGBA,texture-target=2D ! fakesink sync=0 qos=1 name=sink
The code is as follows:
full_pipeline_description = g_strdup_printf("%s", pipeline_cmd);
gub_log_pipeline(pipeline, "Using pipeline: %s", full_pipeline_description);
pipeline->pipeline = gst_parse_launch(full_pipeline_description, &err);
g_free(full_pipeline_description);
if (err) {
    gub_log_pipeline(pipeline, "Failed to create pipeline: %s", err->message);
    return;
}

vsink = gst_parse_bin_from_description(gub_get_video_branch_description(), TRUE, NULL);
gub_log_pipeline(pipeline, "Using video sink: %s", gub_get_video_branch_description());
g_object_set(pipeline->pipeline, "sink", vsink, NULL);
g_object_set(pipeline->pipeline, "flags", 0x0003, NULL);

bus = gst_pipeline_get_bus(GST_PIPELINE(pipeline->pipeline));
gst_bus_add_signal_watch(bus);
g_signal_connect(bus, "message", G_CALLBACK(message_received), pipeline);
gst_object_unref(bus);

// Plant a pad probe to answer context queries
GstElement *sink = gst_bin_get_by_name(GST_BIN(vsink), "sink");
if (sink) {
    GstPad *pad = gst_element_get_static_pad(sink, "sink");
    if (pad) {
        gst_pad_add_probe(pad, GST_PAD_PROBE_TYPE_QUERY_DOWNSTREAM, pad_probe, pipeline, NULL);
        gst_object_unref(pad);
    }
    gst_object_unref(sink);
}
Basically, in this case I just see a blank window (with different colors). The only difference in execution that I can see from my logs is that pad_probe is called when using playbin3 but not when using udpsrc. I would like to understand why this callback is not called when using udpsrc, and whether I'm missing something or using it wrong.
I'm facing the same issue with both GStreamer 1.14.4 and 1.16.2. Any hints are more than welcome.
Upvotes: 1
Views: 428
Reputation: 17325
Finally, after some investigation and based on this thread on the GStreamer-devel mailing list, I found the root cause of the issue. As I suspected, the pad probe callback was not called when using udpsrc; it only worked with playbin3. As a consequence, no graphics context was provided and the video was not rendered correctly. To solve this, I had to add logic that handles messages on the bus and answers GST_MESSAGE_NEED_CONTEXT requests correctly. First, connect a callback to handle bus messages:
g_signal_connect(bus, "message", G_CALLBACK(message_received), pipeline);
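Note that for the "message" signal to be emitted at all, a signal watch must be added to the bus first; the complete setup (as already shown in the question's code) is:

bus = gst_pipeline_get_bus(GST_PIPELINE(pipeline->pipeline));
gst_bus_add_signal_watch(bus); /* required, otherwise the "message" signal is never emitted */
g_signal_connect(bus, "message", G_CALLBACK(message_received), pipeline);
gst_object_unref(bus);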
Then, in the message_received function, I added the following code:
static void message_received(GstBus *bus, GstMessage *message, GUBPipeline *pipeline) {
    switch (GST_MESSAGE_TYPE(message)) {
    ...
    case GST_MESSAGE_NEED_CONTEXT:
    {
        const gchar *context_type;
        GstContext *context = NULL;

        gst_message_parse_context_type(message, &context_type);
        context = gub_provide_graphic_context(pipeline->graphic_context, context_type);
        if (context) {
            gst_element_set_context(GST_ELEMENT(message->src), context);
            gst_context_unref(context);
        }
        break;
    }
    ...
    }
}
With these modifications I'm now able to receive and play the video correctly. The RTP video stream is simulated with ffmpeg's testsrc as follows:
ffmpeg -f lavfi -i testsrc -vf scale=1280:960 -vcodec libx264 -profile:v baseline -pix_fmt yuv420p -f rtp rtp://YOUR_IP:PORT
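To check the stream independently of the application, something like the following gst-launch-1.0 pipeline should also work (a sketch for a quick desktop test; autovideosink is an assumption, the caps match the question's pipeline):

gst-launch-1.0 udpsrc port=53512 ! application/x-rtp,media=video,clock-rate=90000,encoding-name=H264,payload=96 ! rtph264depay ! decodebin3 ! videoconvert ! autovideosink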
Upvotes: 0
Reputation: 2733
g_object_set(pipeline->pipeline, "sink", vsink, NULL);
doesn't actually do anything: a GstPipeline doesn't have a "sink" property (unlike a playbin). Normally it will emit a log warning that tells you exactly that.
To add your sink to the pipeline, you need to do it like you normally would in a GStreamer application: you find the source pad(s) it needs to be connected to, or you wait until the correct source pad to link it to shows up in a "pad-added" signal (this happens, for example, with a decodebin); see the sketch below.
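A minimal sketch of that approach for the question's pipeline, assuming the decodebin3 is given name=dec in the launch string and the elements after it live in the vsink bin instead (these names and the exact split are assumptions):

static void on_pad_added(GstElement *element, GstPad *new_pad, gpointer user_data) {
    GstElement *vsink = GST_ELEMENT(user_data);
    GstPad *sink_pad = gst_element_get_static_pad(vsink, "sink");

    /* Link the newly exposed decoder pad to the sink bin, once */
    if (!gst_pad_is_linked(sink_pad))
        gst_pad_link(new_pad, sink_pad);

    gst_object_unref(sink_pad);
}

/* ... after creating pipeline->pipeline and vsink ... */
gst_bin_add(GST_BIN(pipeline->pipeline), vsink);
gst_element_sync_state_with_parent(vsink);
GstElement *dec = gst_bin_get_by_name(GST_BIN(pipeline->pipeline), "dec");
g_signal_connect(dec, "pad-added", G_CALLBACK(on_pad_added), vsink);
gst_object_unref(dec);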
Upvotes: 1