Reputation: 729
I am making a robot that will have a webcam on it to provide some simple object detection. For now, I would simply like to stream the video to a webpage hosted on the robot and be able to view it from another device. I have written a simple test script in Python (I will eventually move to C++, my language of choice) which can get a stream from my webcam and then do whatever I need with it from there. The problem is that I can't write the video to a file while the app is running; the file is only written after I quit the script. I already have a webserver running, I can write the basic HTML to host a video from a file, and all of that works.
To summarize: Is OpenCV in Python and/or C++ capable of livestreaming video on its own? If not, what library would you recommend for taking a CV capture object or Mat object and writing it to a stream that I can then put on a webpage? In HTML, is the &lt;video&gt; tag a good way to stream video?
Thank you very much for the advice, I can use all the pointers* I can get!
If you need something clarified/code posted/explanations further than what I have given, please ask and I will do so!
Upvotes: 22
Views: 35121
Reputation: 11228
I may be a little late, but since I didn't find a completely up-to-date solution for C++ and MJPEG on Stack Overflow, I thought I'd write a new answer.
There are now some good and simple libraries for this task in C++ (C++ MJPEG streaming to HTML):
https://github.com/nadjieb/cpp-mjpeg-streamer
https://github.com/jacksonliam/mjpg-streamer
https://github.com/codewithpassion/mjpg-streamer/tree/master/mjpg-streamer
I found the first one to be very simple. You need CMake and make installed on the system.
git clone https://github.com/nadjieb/cpp-mjpeg-streamer.git;
cd cpp-mjpeg-streamer;
mkdir build && cd build;
cmake ../;
make;
sudo make install;
Now, write the streamer:
mjpeg_server.cc
#include <iostream>
#include <vector>

#include <opencv2/opencv.hpp>
#include <nadjieb/mjpeg_streamer.hpp>

// for convenience
using MJPEGStreamer = nadjieb::MJPEGStreamer;

int main()
{
    cv::VideoCapture cap;
    cap.open("demo.mp4");
    if (!cap.isOpened())
    {
        std::cerr << "VideoCapture not opened\n";
        exit(EXIT_FAILURE);
    }

    std::vector<int> params = {cv::IMWRITE_JPEG_QUALITY, 90};

    MJPEGStreamer streamer;

    // By default 1 worker is used for streaming;
    // if you want to use 4 workers:
    //     streamer.start(8000, 4);
    streamer.start(8000);

    // Visit /shutdown or another defined target to stop the loop and shut down gracefully
    while (streamer.isAlive())
    {
        cv::Mat frame;
        cap >> frame;
        if (frame.empty())
        {
            std::cerr << "frame not grabbed\n";
            exit(EXIT_FAILURE);
        }

        // http://localhost:8000/bgr
        std::vector<uchar> buff_bgr;
        cv::imencode(".jpg", frame, buff_bgr, params);
        streamer.publish("/bgr", std::string(buff_bgr.begin(), buff_bgr.end()));

        cv::Mat hsv;
        cv::cvtColor(frame, hsv, cv::COLOR_BGR2HSV);

        // http://localhost:8000/hsv
        std::vector<uchar> buff_hsv;
        cv::imencode(".jpg", hsv, buff_hsv, params);
        streamer.publish("/hsv", std::string(buff_hsv.begin(), buff_hsv.end()));
    }

    streamer.stop();
}
Write the CMakeLists.txt
cmake_minimum_required(VERSION 3.1)
project(mjpeg_streamer CXX)
find_package(OpenCV 4.2 REQUIRED)
find_package(nadjieb_mjpeg_streamer REQUIRED)
include_directories(${OpenCV_INCLUDE_DIRS})
add_executable(stream_test
"mjpeg_server.cc")
target_compile_features(stream_test PRIVATE cxx_std_11)
target_link_libraries(stream_test PRIVATE nadjieb_mjpeg_streamer::nadjieb_mjpeg_streamer
${OpenCV_LIBS})
The directory structure should look like:
.
|--- mjpeg_server.cc
|--- CMakeLists.txt
|--- ...
|--- build
|--- demo.mp4
|--- ...
Now, we can build the streamer.
mkdir build && cd build;
cmake ../;
make;
./stream_test
Now, if you go to "http://ip_address:port/bgr" or "http://ip_address:port/hsv", you should be able to see the stream. In my case, ip = 192.168.1.7 (or localhost) and port = 8000.
If you want to grab the stream with another server,
index.html
<html>
<body>
<img src="http://localhost:8000/bgr">
<img src="http://localhost:8000/hsv">
</body>
</html>
serve.py
import http.server
import socketserver
class MyHttpRequestHandler(http.server.SimpleHTTPRequestHandler):
    def do_GET(self):
        if self.path == '/':
            self.path = 'index.html'
        return http.server.SimpleHTTPRequestHandler.do_GET(self)

# Create an object of the above class
handler_object = MyHttpRequestHandler

PORT = 8080
my_server = socketserver.TCPServer(("", PORT), handler_object)

# Start the server
my_server.serve_forever()
python3 serve.py
Finally, note that while this setup is extremely simple, it is not secure.
Upvotes: 0
Reputation: 976
You seem to be under lab conditions, so there is a simplistic yet usable solution: just stream PNGs as Base64 over WebSockets. On the client side (web browser), you receive the Base64 images and load them directly into the src of an &lt;img&gt;. It works very well for lab scenarios, albeit slowly.
Upvotes: 3
Reputation: 31
The issue of streaming frames out of OpenCV and Python has been addressed in the following thread: Pipe raw OpenCV images to FFmpeg
This didn't work for me, but they claim it did for them.
The reason it didn't work in my case seems to be that for some output frames, additional bytes were added or lost somewhere between the output to stdout in capture.py and the input to FFmpeg. Therefore, the number of bytes doesn't correspond to the number of frames. I am not sure why this happens; I was using Windows 7.
I would be curious to hear about your experience if you try this. I also tried a modified version of capture.py using cv2, and it failed for the same reasons.
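For anyone retrying this, the core of the pipe approach looks roughly like the following sketch (function names and the output target are assumptions). The key constraint, and a likely source of the byte-count mismatch described above, is that every write must be exactly width * height * 3 bytes of raw BGR data:

```python
import subprocess

import numpy as np

def ffmpeg_cmd(width, height, fps, target):
    # standard ffmpeg flags for accepting raw BGR frames on stdin
    return [
        "ffmpeg",
        "-f", "rawvideo", "-pix_fmt", "bgr24",
        "-s", f"{width}x{height}", "-r", str(fps),
        "-i", "-",               # read raw frames from stdin
        "-f", "mpegts", target,  # output format/target is an assumption
    ]

def frame_payload(frame, width, height):
    # exactly width*height*3 bytes per frame, or ffmpeg loses sync
    data = np.ascontiguousarray(frame, dtype=np.uint8).tobytes()
    assert len(data) == width * height * 3
    return data

# piping would then be:
# proc = subprocess.Popen(ffmpeg_cmd(640, 480, 30, "udp://127.0.0.1:1234"),
#                         stdin=subprocess.PIPE)
# proc.stdin.write(frame_payload(frame, 640, 480))
```

One possible cause of the extra or lost bytes on Windows is stdout operating in text mode, which translates newline bytes; writing straight to the subprocess's binary stdin, as above, avoids that.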
Upvotes: 3
Reputation: 3213
So basically, you have to use OpenCV to capture the frames, pack them into a format that fits the streaming protocol, and then use HTML5 on your server to put the stream on the page. You may need to use VLC or FFmpeg to pack your cv::Mat. Hope this will be helpful.
Upvotes: 1