Reputation: 530
I have a program that spits out an H.264 raw stream (namely, screenrecord
on Android). I'm using FFmpeg to add a PTS (presentation time stamp) on the frames as follows:
$ my-program | ffmpeg -i - -filter:v setpts='(RTCTIME - RTCSTART) / (TB * 1000000)' out.mp4
This filter computes the current wall-clock time and uses it as the PTS.
The trouble is that my-program
does not produce any output if nothing changes in the video. Since FFmpeg seems to buffer a number of frames before passing them through the setpts
filter, the computed PTS is incorrect. In particular, the last frame of a sequence gets timestamped when the next sequence starts.
Is there a way (with FFmpeg or otherwise) to add the current time as the PTS of raw H.264 frames, where "current time" means the moment the frame is received, rather than the moment it is output?
Note: The problem is not caused by buffering in the pipe.
Upvotes: 1
Views: 7544
Reputation: 92928
You can have the timestamps assigned earlier, and let setpts simply normalize them to start from 0.
my-program | ffmpeg -use_wallclock_as_timestamps 1 -i - -filter:v setpts='PTS-STARTPTS' out.mp4
With -use_wallclock_as_timestamps 1, the timestamp is assigned by the libavformat framework when the packet is received, i.e. at read time rather than after any filter-graph buffering.
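As a side note (an assumption on my part, not from the original answer): since a raw H.264 elementary stream carries no container, ffmpeg may not be able to probe the piped input reliably, so it can help to state the input format explicitly with -f h264. A possible variant of the command above:

```shell
# -f h264: tell ffmpeg the piped input is a raw H.264 elementary stream.
# -use_wallclock_as_timestamps 1: stamp each packet with wall-clock time
#   as libavformat reads it from the pipe.
# setpts=PTS-STARTPTS: shift timestamps so the output starts at 0.
my-program | ffmpeg -f h264 -use_wallclock_as_timestamps 1 -i - \
    -filter:v "setpts=PTS-STARTPTS" out.mp4
```

This is only a sketch; whether the explicit -f h264 is needed depends on how well ffmpeg probes the particular stream.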
Upvotes: 2