Reputation: 67
I have raw data from an RTMP server with pixel format yuv420p.
I use a pipe to read the data, but I don't know how to decode the raw data into an image.
command = ["ffmpeg"]
command.extend(["-loglevel", "fatal", "-i", "rtmp://localhost/live/stream",
                "-f", "flv", "-pix_fmt", "yuv420p", "-vcodec", "h264", "-"])
self.process = subprocess.Popen(command, stderr=subprocess.PIPE, stdout=subprocess.PIPE)
self.output = self.process.stdout
self.fs = width * height * 3 // 2

while True:
    data = self.output.read(self.fs)
I have tried to decode it like this: enter link description here
But the result is: enter image description here
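What I am trying to do is roughly the following (a sketch only; width, height and the pipe setup are as in the code above):
import cv2
import numpy as np

# Read one frame's worth of yuv420p data (width*height*3//2 bytes)
data = self.output.read(self.fs)
# View the buffer as a single-channel image with height*3//2 rows
yuv = np.frombuffer(data, dtype=np.uint8).reshape((height * 3 // 2, width))
# Convert planar YUV 4:2:0 (I420) to the BGR image OpenCV works with
bgr = cv2.cvtColor(yuv, cv2.COLOR_YUV2BGR_I420)
cv2.imshow("frame", bgr)
cv2.waitKey(1)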
Can anyone help me with this problem?
Upvotes: 2
Views: 1580
Reputation: 207385
I am no expert on ffmpeg, so I will defer to anybody who knows better and delete my answer if it proves incorrect.
As far as I can see, you have an RTMP stream that you want to ingest into OpenCV. OpenCV stores images - and video frames, which are just lots of images one after the other - as Numpy arrays with BGR channel ordering. So, I would suggest you ask ffmpeg to convert the Flash video stream to exactly what OpenCV wants:
ffmpeg <RTMP INPUT STUFF> -pix_fmt bgr24 -f rawvideo -
and then change this line, since each frame is now BGR888 at 3 bytes per pixel:
self.fs = width * height * 3
As I don't have an RTMP source available, I generated a test stream like this:
# Generate raw video stream to read into OpenCV
ffmpeg -f lavfi -i testsrc=duration=10:size=640x480:rate=30 -pixel_format rgb24 -f rawvideo -
And then I piped that into Python with:
ffmpeg -f lavfi -i testsrc=duration=10:size=640x480:rate=30 -pixel_format rgb24 -f rawvideo - | ./PlayRawVideo
The Python program PlayRawVideo looks like this:
#!/usr/bin/env python3
import numpy as np
import cv2
import sys

# Set width and height to match the incoming raw video
w, h = 640, 480

while True:
    # Read exactly one frame's worth of 24-bit (3 bytes/pixel) data from stdin
    data = sys.stdin.buffer.read(w * h * 3)
    if len(data) == 0:
        break
    # Wrap the bytes in a Numpy array and reshape into an image
    frame = np.frombuffer(data, dtype=np.uint8).reshape((h, w, 3))
    cv2.imshow("Stream", frame)
    cv2.waitKey(1)
Note that I had to use sys.stdin.buffer.read() to get raw binary data.
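Applied to your original subprocess setup, the whole thing would look roughly like this - an untested sketch, reusing your RTMP URL and assuming you know the stream's width and height:
import subprocess
import numpy as np
import cv2

width, height = 640, 480  # assumption: must match the stream's resolution

# Ask ffmpeg to decode the RTMP stream and emit raw BGR24 frames on stdout
command = ["ffmpeg", "-loglevel", "fatal",
           "-i", "rtmp://localhost/live/stream",
           "-pix_fmt", "bgr24", "-f", "rawvideo", "-"]
process = subprocess.Popen(command, stdout=subprocess.PIPE)

fs = width * height * 3  # bytes per BGR888 frame

while True:
    data = process.stdout.read(fs)
    if len(data) < fs:
        break
    # Raw BGR bytes map straight onto the array layout OpenCV expects
    frame = np.frombuffer(data, dtype=np.uint8).reshape((height, width, 3))
    cv2.imshow("Stream", frame)
    cv2.waitKey(1)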
Upvotes: 1