Sergiy

Reputation: 1069

Can OpenCV decode H264 - MPEG-4 AVC (part 10)

I am trying to use OpenCV (python bindings) to connect to a UDP multicast and recover individual received frames for post-processing.

I can connect to my multicast via VLC, and VLC displays the broadcast with no issues at all. VLC reports that the codec it uses for decoding is H264 - MPEG-4 AVC (part 10).

When I try to decode with OpenCV, I do see my video stream, but many frames appear fragmented: it looks as if the last line of pixels was repeated to fill in the rest of the image (sometimes 75% or more of it). OpenCV also reports decoding errors (error while decoding MB ..., bytestream ...).

Is there any way to force OpenCV to use whatever codec VLC is using? I tried to specify that codec explicitly in my OpenCV code, but it seems to have no effect.

The code I am using is below:

import numpy as np
import cv2
from cv2 import cv  # legacy constants (OpenCV 2.x API)

cap = cv2.VideoCapture()
# Try to request the H.264/AVC decoder by FOURCC (seems to have no effect)
cap.set(cv.CV_CAP_PROP_FOURCC, cv.CV_FOURCC('A', 'V', 'C', '1'))
cwi = cap.open(r'myurlandport')

counter = 0

while cap.isOpened():
    ret, frame = cap.read()
    if not ret:
        break

    counter += 1

    # Only show every 30th frame
    if counter % 30 == 0:
        cv2.imshow('frame', frame)
        if cv2.waitKey(1) & 0xFF == ord('q'):
            break

cap.release()
cv2.destroyAllWindows()

Upvotes: 5

Views: 6446

Answers (1)

Kaan Aydin

Reputation: 11

Last time I checked (OpenCV 2.4.9), the ffmpeg build used by OpenCV did not handle the UDP protocol properly: it doesn't buffer received packets for later use, which leads to lost packets and the decoding errors you are seeing. More info here: http://code.opencv.org/issues/2235

EDIT: To force TCP mode, edit opencv\sources\modules\highgui\src\cap_ffmpeg_impl.hpp at line 538, where the input is opened:

int err=avformat_open_input(&ic, _filename, NULL, NULL);

and replace it so that the rtsp_transport option is set to tcp:

AVDictionary *d = NULL;
av_dict_set(&d, "rtsp_transport", "tcp", 0);  // ask FFmpeg to use TCP for RTSP
int err = avformat_open_input(&ic, _filename, NULL, &d);
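On newer OpenCV builds (roughly 3.4 and later) a rebuild may not be needed: the FFmpeg backend can pick up capture options from the OPENCV_FFMPEG_CAPTURE_OPTIONS environment variable. A minimal Python sketch, assuming the stream is reachable over RTSP (so rtsp_transport applies) and using a placeholder URL:

import os
import cv2

# Ask OpenCV's FFmpeg backend to use TCP transport for RTSP.
# Must be set before the capture is opened; several options can be
# chained as "key;value|key;value".
os.environ["OPENCV_FFMPEG_CAPTURE_OPTIONS"] = "rtsp_transport;tcp"

cap = cv2.VideoCapture("rtsp://myurlandport", cv2.CAP_FFMPEG)  # placeholder URL

while cap.isOpened():
    ret, frame = cap.read()
    if not ret:
        break
    cv2.imshow("frame", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break

cap.release()
cv2.destroyAllWindows()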

Upvotes: 1
