I have a command that streams video from a Raspberry Pi camera to my server over RTP:
libcamera-vid -o - -t 0 -g 30 --width 640 --height 480 | ffmpeg -re -f h264 -i pipe:0 -vcodec copy -strict experimental -f rtp rtp://SERVER
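For context, RTP has no in-band stream description, so on the server side FFmpeg reads the stream back through an SDP file. A sketch of receiving it (stream.sdp is an assumed filename for the session description that matches the sender's payload type and port):

```shell
# Read the incoming RTP stream via its SDP description and grab
# a single decoded frame as PNG. -frames:v 1 stops after one frame;
# -y overwrites snapshot.png if it already exists.
ffmpeg -protocol_whitelist file,udp,rtp -i stream.sdp \
  -frames:v 1 -y snapshot.png
```

This decodes only one frame on demand, so the per-request CPU cost stays small, but FFmpeg still has to wait for the next keyframe in the stream before it can produce a picture.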
On request from my server, I want to extract a .png image from the video (so I can be sure it is uncompressed and current), but I don't know the proper way to do this without significant CPU overhead. The options I have considered:
1. Use OpenCV's VideoCapture and VideoWriter: grab frames from the camera, write them out over RTSP via GStreamer, and check for each frame whether it should be sent to the server. This re-encodes every frame and adds overhead.
2. Use the FFmpeg libraries (libav) directly: write a C program that does the streaming, plus a TCP client to receive commands. But is it even possible to grab an image while libavcodec is streaming the video?
3. Capture an image every n seconds with FFmpeg and send one when requested. This adds constant overhead, and the captured image may not be needed at that moment:
   -r 1/60 output.png
4. Patch the system to capture video and pictures at the same time.
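Option 3 can be folded into the existing pipeline, since FFmpeg accepts several outputs from one input: keep the zero-cost -c:v copy path to the server and add a second, low-rate PNG output. A sketch (latest.png and the fps=1/60 rate are assumptions, not from the original command):

```shell
# One ffmpeg, two outputs: output options apply to the output that
# follows them. The RTP output copies the H.264 bitstream untouched;
# the PNG output decodes one frame per 60 s and, with -update 1,
# keeps overwriting the same file so latest.png is always current.
libcamera-vid -o - -t 0 -g 30 --width 640 --height 480 | \
ffmpeg -re -f h264 -i pipe:0 \
  -c:v copy -f rtp rtp://SERVER \
  -vf fps=1/60 -update 1 latest.png
```

The decode cost is limited to one frame per minute rather than the full frame rate, which may be an acceptable compromise between options 3 and 4.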