Reputation: 31
I'm trying to create a live stream from a webcam to another server, and my problem is that the frame I'm getting is too big for the socket to handle.
I'm getting this error:
error: [Errno 10040] A message sent on a datagram socket was larger than the internal message buffer or some other network limit, or the buffer used to receive a datagram into was smaller than the datagram itself
Here is my code:
import numpy as np
import cv2
import socket
import sys
import select
cap = cv2.VideoCapture(0)
address = ('localhost', 6005)
client_socket=socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
while True:
    # Capture frame-by-frame
    ret, frame = cap.read()
    print(frame)
    client_socket.connect(('120.0.0.1', 6005))
    # the whole raw frame is sent as a single datagram, which is where the error occurs
    client_socket.sendto(frame, address)
I know C, and I thought maybe I could create a pointer to the frame and send it part by part. My question is how I can translate that approach to Python, and whether it is possible at all.
Upvotes: 1
Views: 1693
Reputation: 2161
From Ans:
Your image is too big to be sent in one UDP packet. You need to split the image data into several packets that are sent individually.
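For example, here is a minimal sketch of that chunking idea: it JPEG-compresses each frame with cv2.imencode, splits the compressed bytes into chunks below the UDP datagram limit, and prefixes each chunk with its index and the total count so a receiver could reassemble the frame. The chunk size and the small struct header are illustrative choices, not something your code already has.

import cv2
import socket
import struct

CHUNK_SIZE = 60000              # stay well under the ~65507-byte UDP payload limit
address = ('localhost', 6005)
sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

cap = cv2.VideoCapture(0)
while True:
    ret, frame = cap.read()
    if not ret:
        break

    # JPEG-compress the frame first so there is far less data to send
    ok, encoded = cv2.imencode('.jpg', frame)
    if not ok:
        continue
    data = encoded.tobytes()

    # split the compressed frame into chunks and prefix each with
    # (chunk index, total chunks) so the receiver can put the frame back together
    chunks = [data[i:i + CHUNK_SIZE] for i in range(0, len(data), CHUNK_SIZE)]
    for idx, chunk in enumerate(chunks):
        header = struct.pack('!HH', idx, len(chunks))
        sock.sendto(header + chunk, address)

The receiver then has to collect all chunks of a frame before decoding it, and with UDP it also has to cope with chunks arriving out of order or not at all.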
If you don't have a special reason to use UDP, you could also use TCP by specifying socket.SOCK_STREAM instead of socket.SOCK_DGRAM. Then you don't have to worry about packet sizes and ordering.
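Here is a minimal sketch of the TCP variant, assuming a server is listening on the same port. Because TCP is a byte stream with no message boundaries, the sketch length-prefixes each frame; that framing is an illustrative choice, not something TCP gives you automatically.

import cv2
import socket
import struct

sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.connect(('localhost', 6005))   # connect once, outside the capture loop

cap = cv2.VideoCapture(0)
while True:
    ret, frame = cap.read()
    if not ret:
        break

    # JPEG-compress the frame to keep the amount of data per frame small
    ok, encoded = cv2.imencode('.jpg', frame)
    if not ok:
        continue
    data = encoded.tobytes()

    # prefix each frame with its length so the receiver knows where one
    # frame ends and the next begins in the continuous byte stream
    sock.sendall(struct.pack('!I', len(data)) + data)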
You can also look at Ans
Upvotes: 1