NewProggie

Reputation: 1095

Signal/Slot OpenCV Mat over different threads in Qt

I acquire images from a hardware (PointGrey) camera and put them into OpenCV matrices in a dedicated (camera) thread. I want to display these images in a QWidget which is running in another (gui) thread.

If the image acquisition and the gui are running in the same thread, everything is fine memory-wise, but as soon as the camera is running in another thread, I get a memory leak because the OpenCV matrices are not deleted properly.

The whole thing looks as follows:

 Thread A                         Thread B
+---------+                      +---------+
| Camera  |                      | QWidget |
+---------+                      +---------+
     |                                |
     | emit camFrame(frame);--------->|
     |                                |-> setImage(cv::Mat frame);
     |                                |

I connect the camFrame signal with the setImage slot inside a QMainWindow instance:

mCameraThread = new QThread;
mCamera->moveToThread(mCameraThread);

/* connect camera with attached thread */
connect(mCameraThread, SIGNAL(started()), mCamera, SLOT(start()));
connect(mCamera, SIGNAL(stopped()), mCameraThread, SLOT(quit()));
connect(mCamera, SIGNAL(stopped()), mCamera, SLOT(deleteLater()));
connect(mCameraThread, SIGNAL(finished()), mCameraThread, SLOT(deleteLater()));

/* connect camera with camerawidget thread */
connect(mCamera, SIGNAL(camFrame(cv::Mat)), mCameraPreviewWidget, SLOT(setImage(cv::Mat)));

/* start camera in separate thread with high priority */
mCameraThread->start();
mCameraThread->setPriority(QThread::TimeCriticalPriority);
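
For reference, the signal and slot on either end of that connection are declared roughly like this (a sketch; the exact class names are assumptions based on the snippet above):

class Camera : public QObject {
    Q_OBJECT
public slots:
    void start();
signals:
    void camFrame(cv::Mat frame);
    void stopped();
};

class CameraPreviewWidget : public QWidget {
    Q_OBJECT
public slots:
    void setImage(cv::Mat frame);
};

Since the sender lives in mCameraThread and the receiver in the gui thread, Qt makes this a queued connection, so each emitted cv::Mat is copied (a cheap, reference-counted header copy) into the receiver's event queue.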

I am not allocating new memory when sending the OpenCV matrix to the other thread, so I have no idea why the memory leak is happening. I am doing:

Image rawImage;
mError = mCamera.RetrieveBuffer(&rawImage);

// wrap the raw buffer, then deep-copy it so the emitted Mat owns its own data
cv::Mat tmpFrame(rawImage.GetRows(), rawImage.GetCols(), CV_8UC1);
tmpFrame.data = rawImage.GetData();
cv::Mat actualFrame = tmpFrame.clone();
tmpFrame.release();
emit camFrame(actualFrame);

Can somebody point me in the right direction as to what is possibly going wrong here? Thanks in advance.

FIXED

Thanks, everybody. The example code above is actually fine. My mistake was that I accidentally started the camera twice in code. While experimenting with the second thread, I forgot to remove the line

mCamera->start();

from my code. Sorry for taking up your time with this silly mistake.

Upvotes: 2

Views: 2213

Answers (2)

James Harper

Reputation: 560

OpenCV won't automatically release the resources when the data is assigned via a pointer ("frame.data = rawImage.GetData();"). You might want to make a copy or release the Mat manually.
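
To illustrate the difference (a rough sketch; rawImage stands for the PointGrey image from the question, and the Mat constructor taking an external pointer is standard OpenCV):

#include <opencv2/core/core.hpp>

// A Mat wrapped around an external buffer does NOT own that memory:
// release() will not free it, and the Mat dangles once the buffer is
// overwritten by the next RetrieveBuffer() call.
cv::Mat wrapped(rawImage.GetRows(), rawImage.GetCols(), CV_8UC1, rawImage.GetData());

// clone() deep-copies the pixels into reference-counted memory owned by
// the new Mat, so it is safe to emit across threads.
cv::Mat owned = wrapped.clone();

Assigning to tmpFrame.data, as in the question, creates the same kind of non-owning wrapper, which is why the subsequent clone() there is the right move.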

Upvotes: 1

mfuchs

Reputation: 2250

Unfortunately I cannot comment yet, thus an answer:

Did you register the type via Q_DECLARE_METATYPE? If not, Qt may not actually be sending a cv::Mat but instead be using one of the many conversion operators of cv::Mat.

Qt 5.3.1 documentation for Q_DECLARE_METATYPE (emphasis and note are mine):

Adding a Q_DECLARE_METATYPE() makes the type known to all template based functions, including QVariant. Note that if you intend to use the type in queued signal and slot connections [1] or in QObject's property system, you also have to call qRegisterMetaType() since the names are resolved at runtime.

[1] Which happens on multi threading by default.
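
For cv::Mat that would look roughly like this (a sketch; where exactly you put the declaration and the registration call is up to you, only the macro and the function themselves come from Qt):

#include <QMetaType>
#include <opencv2/core/core.hpp>

// In a header visible to both the camera and the widget:
Q_DECLARE_METATYPE(cv::Mat)

// Once at startup, before the queued connection is first used, so Qt can
// marshal cv::Mat through the receiver's event queue:
qRegisterMetaType<cv::Mat>("cv::Mat");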

Upvotes: 0
