Reputation: 103
I am trying to receive H.264 frames from a USB webcam connected to my Raspberry Pi.
Using the RPi Camera Module I can run the following command to get H.264 data written to stdout with close to zero latency: raspivid -t 0 -w 640 -h 320 -fps 15 -o -
Is there an equivalent way to do this with a USB camera? I have two USB cameras I would like to do this with.
Running ffprobe /dev/videoX on each device, I get the following output (shortened down to the important details):
$ ffprobe /dev/video0
...
Input #0, video4linux2,v4l2, from '/dev/video0':
Duration: N/A, start: 18876.273861, bitrate: 147456 kb/s
Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 1280x720, 147456 kb/s, 10 fps, 10 tbr, 1000k tbn, 1000k tbc
$ ffprobe /dev/video1
...
Input #0, video4linux2,v4l2, from '/dev/video1':
Duration: N/A, start: 18980.783228, bitrate: 115200 kb/s
Stream #0:0: Video: rawvideo (YUY2 / 0x32595559), yuyv422, 800x600, 115200 kb/s, 15 fps, 15 tbr, 1000k tbn, 1000k tbc
$ ffprobe /dev/video2
...
Input #0, video4linux2,v4l2, from '/dev/video2':
Duration: N/A, start: 18998.984143, bitrate: N/A
Stream #0:0: Video: h264 (Main), yuv420p(progressive), 1920x1080, -5 kb/s, 30 fps, 30 tbr, 1000k tbn, 2000k tbc
As far as I can tell, two of them are not H.264, so their output would have to be encoded to H.264, which I understand adds a bit of latency. But the third one (video2) is H.264, so I should be able to get data from it directly? I've tried to just pipe it out with cat, but that fails with "invalid argument".
From what I can tell, ffmpeg might be the only option here. I would like to use software that is easily available for all RPis (apt install).
Bonus question regarding H.264 packets: when I stream the data from the raspivid command to my decoder it works perfectly. But if I drop the first 10 packets, the decoder never initializes and just shows a black background. Does anyone know what might be in those first packets that I could recreate in my software, so I don't have to restart the stream for every newly connected user?
EDIT: Bonus question answer: after googling around I see that the first two NAL units raspivid sends are a sequence parameter set and a picture parameter set. If my decoder misses those two, it won't decode properly; but if I save them and send them first to every newly connected user, it works perfectly. They carry the information the decoder needs to initialize:
0x27 = 0 01 00111 = NAL unit type 7: sequence parameter set (SPS)
0x28 = 0 01 01000 = NAL unit type 8: picture parameter set (PPS)
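The caching trick described above can be sketched in a few lines of Python. This is a minimal illustration, not a full parser: it splits an Annex B byte stream on start codes, reads the NAL unit type from the low 5 bits of the first byte, and keeps the SPS (type 7) and PPS (type 8) so they can be replayed to late-joining clients. The function names are hypothetical.

```python
def split_nal_units(data: bytes):
    """Split an Annex B stream on 00 00 01 / 00 00 00 01 start codes."""
    units = []
    i = data.find(b"\x00\x00\x01")
    while i != -1:
        start = i + 3
        nxt = data.find(b"\x00\x00\x01", start)
        end = nxt if nxt != -1 else len(data)
        # Drop the trailing 00 that belongs to a 4-byte start code.
        if nxt != -1 and data[nxt - 1] == 0:
            end -= 1
        units.append(data[start:end])
        i = nxt
    return units

def cache_parameter_sets(units):
    """Return (sps, pps); the NAL type is the low 5 bits of the first byte."""
    sps = pps = None
    for u in units:
        nal_type = u[0] & 0x1F
        if nal_type == 7:      # sequence parameter set
            sps = u
        elif nal_type == 8:    # picture parameter set
            pps = u
    return sps, pps
```

Prepending the cached SPS and PPS (each with a start code) to the stream sent to a new client is then enough for its decoder to initialize mid-stream.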
Upvotes: 3
Views: 8519
Reputation: 1285
First, let us get the data flow right. For the Raspi cam:
The Raspi camera is connected by CSI (Camera Serial Interface) to the Raspi. This link carries uncompressed, raw image data.
raspivid talks to the embedded GPU of the Raspi to access the image data and also asks the GPU to perform H.264 encoding, which always adds some latency (you could use raspiyuv to get the raw uncompressed data, possibly with less latency).
USB webcams typically transfer uncompressed, raw image data, but some can also deliver H.264- or JPEG-encoded data.
Next, the Video for Linux API version 2 was not made for shell pipes, so you can't get data out of a /dev/videoX with cat. You need some code to perform IOCTL calls to negotiate what and how to read data from the device. ffmpeg does exactly that.
Regarding your bonus question, you might try the --inline option of raspivid, which forces the stream to include PPS and SPS headers on every I-frame.
Next, outputting H.264 data from ffmpeg using -f rawvideo looks wrong to me, since rawvideo means uncompressed video. Try -f h264 instead to force a raw H.264 output format:
ffmpeg -i /dev/video2 -c copy -f h264 pipe:1
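If you want to consume that pipe from your own software, a small Python wrapper around the same command looks roughly like this. The device path and chunk size are assumptions; stderr is discarded so ffmpeg's log output doesn't mix with the video bytes.

```python
import subprocess

def build_ffmpeg_cmd(device: str):
    """Argument list matching the shell command above."""
    return ["ffmpeg", "-i", device, "-c", "copy", "-f", "h264", "pipe:1"]

def stream_h264(device="/dev/video2", chunk_size=4096):
    """Yield raw H.264 chunks read from ffmpeg's stdout."""
    proc = subprocess.Popen(build_ffmpeg_cmd(device),
                            stdout=subprocess.PIPE,
                            stderr=subprocess.DEVNULL)
    try:
        while True:
            chunk = proc.stdout.read(chunk_size)
            if not chunk:
                break
            yield chunk  # hand the H.264 bytes to connected clients
    finally:
        proc.terminate()
```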
Finally, you actually want to get an H.264 stream from your USB webcam. Since the image data comes uncompressed from the camera, it first has to be encoded to H.264. The sensible option on the Raspi is to use the hardware encoder, since a software encoder like x264 would consume too much CPU.
If you have an ffmpeg that was configured using --enable-mmal and/or --enable-omx-rpi, you can use ffmpeg to talk to the hardware H.264 encoder.
Otherwise, take a look at gstreamer and its omxh264enc element, e.g. here. gstreamer can also talk to v4l2 devices.
Upvotes: 2