Reputation: 2175
Currently I have a setup like this:
my-app | gst-launch-1.0 -e fdsrc ! \
videoparse format=GST_VIDEO_FORMAT_BGR width=640 height=480 ! \
videoconvert ! 'video/x-raw, format=I420' ! x265enc ! h265parse ! \
matroskamux ! filesink location=my.mkv
From my-app I am streaming raw BGR frame buffers to gst. How can I also pass presentation timestamps (PTSs) for those frames? I have fairly full control over my-app; I can open other pipes to gst from it.
I know I have the option to use the GStreamer C/C++ API or write a GStreamer plugin, but I was trying to avoid that.
Upvotes: 1
Views: 2779
Reputation: 7383
I guess you can set a framerate for the videoparse element. You can also try do-timestamp=true on the fdsrc - maybe it requires a combination of both.
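For example, a variation of your original pipeline with both set (the 30/1 framerate is just a guess, use whatever rate my-app actually produces):
my-app | gst-launch-1.0 -e fdsrc do-timestamp=true ! \
videoparse format=GST_VIDEO_FORMAT_BGR width=640 height=480 framerate=30/1 ! \
videoconvert ! 'video/x-raw, format=I420' ! x265enc ! h265parse ! \
matroskamux ! filesink location=my.mkv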
If you have the PTS in my-app you would probably need to wrap the buffers and their PTS in a real GstBuffer and use gdppay and gdpdepay as the payload format on the link between the two processes.
For example, if your my-app dumped the images in the format described in https://github.com/GStreamer/gstreamer/blob/master/docs/random/gdp (not sure how up to date that document is), you could receive the data with the following pipeline:
fdsrc ! gdpdepay ! videoconvert ! ..
There is no need to specify resolution and format either, as they are part of the protocol too. And you will get the PTS as well if it was set.
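For example, a sketch that reuses the encoder part of your original pipeline:
my-app | gst-launch-1.0 -e fdsrc ! gdpdepay ! videoconvert ! \
'video/x-raw, format=I420' ! x265enc ! h265parse ! \
matroskamux ! filesink location=my.mkv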
If you can use the GStreamer library in my-app you could use some pipeline like this:
appsrc ! gdppay ! fakesink dump=true
And you would push your image buffers with PTS to the appsrc.
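A minimal C sketch of that sending side, assuming 640x480 BGR at 30 fps and dummy pixel data in place of your real frames (error handling mostly omitted):
/* Sketch: push BGR frames with explicit PTS into appsrc ! gdppay.
 * Build with: gcc demo.c $(pkg-config --cflags --libs gstreamer-app-1.0)
 * The 640x480/30fps numbers and the dummy pixel fill are assumptions. */
#include <gst/gst.h>
#include <gst/app/gstappsrc.h>

int main (int argc, char *argv[])
{
  gst_init (&argc, &argv);

  GstElement *pipeline = gst_parse_launch (
      "appsrc name=src ! gdppay ! fakesink dump=true", NULL);
  GstElement *appsrc = gst_bin_get_by_name (GST_BIN (pipeline), "src");

  /* Tell appsrc what the raw frames look like and that buffers are timestamped in time units. */
  GstCaps *caps = gst_caps_from_string (
      "video/x-raw,format=BGR,width=640,height=480,framerate=30/1");
  g_object_set (appsrc, "caps", caps, "format", GST_FORMAT_TIME, NULL);
  gst_caps_unref (caps);

  gst_element_set_state (pipeline, GST_STATE_PLAYING);

  const gsize frame_size = 640 * 480 * 3;     /* one BGR frame */
  for (guint i = 0; i < 100; i++) {           /* 100 dummy frames for the sketch */
    GstBuffer *buf = gst_buffer_new_allocate (NULL, frame_size, NULL);
    gst_buffer_memset (buf, 0, 0x42, frame_size);  /* stand-in for your BGR pixels */

    /* This is where the PTS (and duration) get attached. */
    GST_BUFFER_PTS (buf) = gst_util_uint64_scale (i, GST_SECOND, 30);
    GST_BUFFER_DURATION (buf) = gst_util_uint64_scale (1, GST_SECOND, 30);

    /* push_buffer takes ownership of buf. */
    if (gst_app_src_push_buffer (GST_APP_SRC (appsrc), buf) != GST_FLOW_OK)
      break;
  }
  gst_app_src_end_of_stream (GST_APP_SRC (appsrc));

  /* Wait for EOS (or an error) before tearing down. */
  GstBus *bus = gst_element_get_bus (pipeline);
  GstMessage *msg = gst_bus_timed_pop_filtered (bus, GST_CLOCK_TIME_NONE,
      GST_MESSAGE_EOS | GST_MESSAGE_ERROR);
  if (msg)
    gst_message_unref (msg);
  gst_object_unref (bus);

  gst_element_set_state (pipeline, GST_STATE_NULL);
  gst_object_unref (appsrc);
  gst_object_unref (pipeline);
  return 0;
}
Swapping fakesink for something like fdsink would let this feed the gdpdepay pipeline above over a pipe.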
See https://github.com/GStreamer/gst-plugins-bad/tree/master/gst/gdp for some examples of how gdp is used as a protocol.
Upvotes: 2