Arvind

Reputation: 730

Gstreamer appsrc to stream OpenGL framebuffer

I'm trying to stream an OpenGL framebuffer from my OSX app using GStreamer-1.0 via hlssink. My pipeline is appsrc->decodebin->videoconvert->x264enc->mpegtsmux->hlssink. The problem is that the feed appears like this, and there is at least a 10 second delay.

screen capture on VLC

If you look at the picture, there is a wrapped-up image of my desktop. I just started learning GStreamer and don't have much idea about the encoding/muxing part. One more thing I noticed: even without the encoding and muxing parts, i.e. appsrc->videoconvert->osxvideosink, the feed appears just like this.

What might be the problem? How do I get a clear feed from the framebuffer?

How am I supposed to approach this problem in order to stream in real time, or at least with minimal delay?

Should I use tcpserversink instead of hlssink for decreasing the latency?

I'm integrating GStreamer with my OSX app, which produces the source buffer for appsrc. My end goal is to stream the realtime feed over HTTP/TCP.

I've been working on this for two weeks now, so there is a fair chance I have missed something very basic. Feel free to comment with your opinions, and do let me know if anyone needs more info or the source code.

EDIT:

These are the caps I'm setting for appsrc:

caps = gst_caps_new_simple ("video/x-raw",
                            "width", G_TYPE_INT, 1280,
                            "height", G_TYPE_INT, 800,
                            "format", G_TYPE_STRING, "RGB16", NULL);

The type of data fed is raw data from the framebuffer. This is a screen casting app for mac.
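For reference, here is a trimmed-down sketch of how the appsrc could be configured for this kind of live feed. The framerate and the is-live/do-timestamp settings are illustrative assumptions, not the exact values from my app:

#include <gst/gst.h>

/* Sketch: appsrc configured for live raw RGB16 frames from the framebuffer.
 * The 30/1 framerate and the property values are illustrative assumptions. */
static void setup_appsrc (GstElement *appsrc)
{
    GstCaps *caps = gst_caps_new_simple ("video/x-raw",
                                         "format", G_TYPE_STRING, "RGB16",
                                         "width", G_TYPE_INT, 1280,
                                         "height", G_TYPE_INT, 800,
                                         "framerate", GST_TYPE_FRACTION, 30, 1,
                                         NULL);
    g_object_set (appsrc,
                  "caps", caps,
                  "format", GST_FORMAT_TIME,  /* timestamp buffers in running time */
                  "is-live", TRUE,            /* behave like a live capture source */
                  "do-timestamp", TRUE,       /* let appsrc stamp incoming buffers */
                  NULL);
    gst_caps_unref (caps);
}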

Is hlssink the right choice for realtime screen casting? Should I try tcpserversink?

Upvotes: 0

Views: 1577

Answers (2)

Muzammil360

Reputation: 29

In order to improve latency, there are two main avenues of improvement:

  1. Streaming technology
  2. Encoding technology

Streaming technology

Each option has its pros and cons, so you need to optimize for your application.

If you use HLS or DASH, expect 10-30 seconds of latency. This is because these protocols were designed for livestreaming applications, such as sports, which don't necessarily need very low latency. They work by cutting the video stream into small chunks (say 10 seconds) and transmitting them over HTTP, which also makes rewinding much more efficient. If you reduce the chunk length on the server side and reduce the number of chunks the client buffers, latency will improve, but not to sub-second levels. Newer technologies such as fragmented HLS/DASH can do better, but not all clients support them.
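As a rough illustration of shortening the chunk length with hlssink, something like the following sketch could be used (the element choices and property values here are assumptions to adapt, not a drop-in replacement for your pipeline):

#include <gst/gst.h>

int main (int argc, char *argv[])
{
    gst_init (&argc, &argv);

    /* Illustrative pipeline: shorter HLS segments mean less end-to-end delay.
     * target-duration is in seconds; playlist-length limits client buffering. */
    GstElement *pipeline = gst_parse_launch (
        "videotestsrc is-live=true ! videoconvert ! x264enc tune=zerolatency "
        "! mpegtsmux ! hlssink target-duration=2 playlist-length=3 max-files=5 "
        "location=segment%05d.ts playlist-location=playlist.m3u8", NULL);

    gst_element_set_state (pipeline, GST_STATE_PLAYING);

    GMainLoop *loop = g_main_loop_new (NULL, FALSE);
    g_main_loop_run (loop);
    return 0;
}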

If you use RTMP, this will significantly improve latency over HLS/DASH, but you generally need to open a custom port (1935). It's also an old piece of tech, so newer clients might be dropping support for it.

WebRTC is a shiny new tech for sub-second latency. It has advantages if you want to do 1:n streaming or add custom data on top of your stream. It's natively supported by HTML5 browsers, so that's a plus as well. One downside is that it is relatively complex and will need more development effort.

Encoding

The other source of latency is encoding. This is really an optimization problem between time, compute power, and network bandwidth: attempting to reduce one will increase the others.

Most encoders offer a low-latency option at the expense of bandwidth. More complex codecs (e.g. using H.265 over H.264) require more compute power to maintain the same latency, but use less bandwidth. You need to optimize based on your end application.
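For x264enc in particular, a minimal sketch of the usual low-latency knobs looks like this (the bitrate and keyframe interval are assumed values to tune for your own content):

#include <gst/gst.h>

/* Sketch: typical low-latency settings for x264enc.
 * The bitrate and keyframe interval are illustrative assumptions. */
static void configure_encoder (GstElement *x264enc)
{
    /* gst_util_set_object_arg parses string values for enum/flags properties */
    gst_util_set_object_arg (G_OBJECT (x264enc), "tune", "zerolatency");
    gst_util_set_object_arg (G_OBJECT (x264enc), "speed-preset", "ultrafast");
    g_object_set (x264enc,
                  "bitrate", 2048,    /* kbit/s: trades quality for bandwidth */
                  "key-int-max", 30,  /* frequent keyframes help clients join */
                  NULL);
}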

Upvotes: 0

Florian Zwoch

Reputation: 7383

For latency consider using the tune=zerolatency option for the x264enc element.

For the appsrc we need to know what kind of data you feed into the pipeline and what caps you set there. Most likely they don't match, so GStreamer misinterprets the data representation. A wrapped or skewed picture like yours is the classic symptom of a mismatch in width, stride, or bytes per pixel.
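For example, here is a sketch under the assumption that the frames are read back with glReadPixels in BGRA order (common on macOS); the caps must describe exactly what the pushed buffers contain:

#include <gst/gst.h>

/* Sketch: the caps must match the bytes actually pushed into appsrc.
 * Assumption: glReadPixels (..., GL_BGRA, GL_UNSIGNED_BYTE, ...) is used,
 * so each pixel is 4 bytes in BGRA order -- declare "BGRA", not "RGB16". */
GstCaps *caps = gst_caps_new_simple ("video/x-raw",
                                     "format", G_TYPE_STRING, "BGRA",
                                     "width", G_TYPE_INT, 1280,
                                     "height", G_TYPE_INT, 800,
                                     "framerate", GST_TYPE_FRACTION, 30, 1,
                                     NULL);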

Upvotes: 1
