Reputation: 4333
In my program I load some images, extract features from them, and store these features in a cv::Mat. Based on the number of images, I know the cv::Mat will end up being 700,000 x 256 in size (rows x cols), which is about 720 MB. But when my program reaches roughly 400,000 x 256 (400 MB) and tries to add more, it simply crashes with a Fatal Error. Can anyone confirm that 400 MB is indeed the limit of cv::Mat's storage capacity? Should I be looking for other issues? What are possible ways to overcome this problem?
Upvotes: 5
Views: 5058
Reputation: 1651
Digging into the source code: when you use push_back, it checks whether there is enough space for a new element; if not, it reallocates the matrix with space for (current_size * 3 + 1) / 2 elements (see here). In your example, at around 400,000 x 256 (a total of 102,400,000 elements) it triggers another reallocation, so it tries to allocate space for 307,200,001 / 2 = 153,600,000 elements. But in order to grow, it has to allocate a new block and then copy the existing data over.
From matrix.cpp:
Mat m(dims, size.p, type());
size.p[0] = r;
if( r > 0 )
{
    Mat mpart = m.rowRange(0, r);
    copyTo(mpart);
}
*this = m;
So it essentially:

- allocates a new, larger matrix,
- copies the existing rows into it,
- and replaces the old matrix with the new one.
Meaning that, in your case, it needs enough space for (600,000 + 400,000) * 256 elements, roughly 1 GB of data with 4-byte integers. On top of that, it creates an auxiliary matrix of one row and, in this case, 600,000 columns, which accounts for 2,400,000 extra bytes.
So, by the next iteration, when it reaches 600,000 rows, it tries to allocate 900,000 x 256 elements (~900 MB) plus the existing 600,000 x 256 elements (~600 MB) plus the auxiliary 600,000 elements (~2.4 MB). In short, just by allocating this way (using push_back), you are doing several expensive reallocations.
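To get a feel for how many of these reallocations happen on the way to 700,000 rows, here is a rough back-of-the-envelope sketch. It assumes the (rows * 3 + 1) / 2 growth rule described above, 256 columns, 4-byte elements, and an arbitrary starting size of 1,000 rows; the exact numbers depend on your OpenCV version:

#include <cstdio>

int main()
{
    // Simulates the assumed growth rule: whenever push_back runs out of room,
    // the matrix is reallocated to (rows * 3 + 1) / 2 rows.
    long long rows = 1000, cols = 256, elem_size = 4, target = 700000;
    while (rows < target)
    {
        long long new_rows = (rows * 3 + 1) / 2;
        // During the copy, the old and the new buffer are alive at the same time.
        double peak_mb = double((rows + new_rows) * cols * elem_size) / 1e6;
        std::printf("grow %lld -> %lld rows, peak ~%.0f MB\n",
                    rows, new_rows, peak_mb);
        rows = new_rows;
    }
    return 0;
}

In this sketch it takes about 17 reallocations to pass 700,000 rows, and the last ones each peak above 1 GB of simultaneously allocated memory.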
In other words: since you already know the approximate size of the matrix, using reserve is a must. It is several times faster, since it avoids the reallocations and copies.
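A minimal sketch of that approach, assuming 32-bit float features and a row count known in advance (the feature extraction itself is stubbed out here):

#include <opencv2/opencv.hpp>

int main()
{
    const int n_rows = 700000;   // known in advance from the number of images
    const int n_cols = 256;

    cv::Mat features(0, n_cols, CV_32F);
    features.reserve(n_rows);    // one allocation up front, no growth later

    for (int i = 0; i < n_rows; ++i)
    {
        // Hypothetical placeholder row: in the real program this would come
        // from the feature extractor.
        cv::Mat row = cv::Mat::zeros(1, n_cols, CV_32F);
        features.push_back(row);
    }
    return 0;
}

With the space reserved up front, each push_back only copies the one new row instead of occasionally copying the whole matrix.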
Also, as a workaround, you could try inserting into the transposed matrix and then, after the process is done, transposing it back.
Side question: shouldn't this implementation use realloc instead of malloc/memcpy?
Upvotes: 4
Reputation: 7103
I created a matrix as follows, using CV_8UC4 since that gives roughly 700 MB. No problem whatsoever. So no, 400 MB is not the limit, and neither is 700 MB. I tried it with twice as much (1,400,000 rows, 1.4 GB) and that still was not the limit (my default image viewer could not display the resulting BMP file, though).
const unsigned int N_rows = 700000;
const unsigned int N_cols = 256;
cv::Mat m(N_rows, N_cols, CV_8UC4);   // 700,000 * 256 * 4 bytes, roughly 717 MB
for (int r = 0; r < m.rows; ++r)
{
    for (int c = 0; c < m.cols; ++c)
    {
        // Touch every row so the memory is actually written, not just reserved.
        m.data[(r * N_cols + c) * 4] = c % 256;
    }
}
cv::imwrite("test.bmp", m);
Possible ways to overcome the problem:

- Allocate the full cv::Mat at the beginning, maybe even with a little extra, to make sure. If your problem is caused by re-allocations, this will help.

Upvotes: 1
Reputation: 1707
There is no strict limit on the size of a cv::Mat. You should be able to allocate memory as long as it is available.
Here is a small program that shows what can happen to the data pointer when running cv::Mat::push_back a number of times. Playing around with the values for rows and cols can result in one or many values being printed for a.data before eventually an out-of-memory exception is thrown.
#include <opencv2/opencv.hpp>
#include <iostream>

int main()
{
    int rows = 1, cols = 10000;
    cv::Mat a(rows, cols, CV_8UC1);
    while (1) {
        // Append one more row; print the data pointer to see when it moves.
        a.push_back(cv::Mat(1, cols, CV_8UC1));
        std::cout << (size_t) a.data << std::endl;
    }
}
What the above code does for various values of rows and cols really depends on the allocator. So, consideration should be given to both small and large initial sizes for a.
Remember that, like the C++11 std::vector, the elements in a cv::Mat are contiguous. Access to the underlying data can be obtained through the cv::Mat::data member. Calling std::vector::push_back or cv::Mat::push_back repeatedly may result in a reallocation of the underlying memory. In that case, the data has to be moved to a new address, and roughly twice the amount of memory may be needed to move it from old to new (barring any tricky algorithm that uses less memory).
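A small illustration of those two points, contiguity and the moving data pointer, using only standard cv::Mat calls (the sizes here are arbitrary):

#include <opencv2/opencv.hpp>
#include <iostream>

int main()
{
    cv::Mat m(4, 256, CV_32F);
    // A freshly allocated Mat stores its rows back to back in one block.
    std::cout << "continuous: " << m.isContinuous() << std::endl;
    std::cout << "data before push_back: " << (void*) m.data << std::endl;

    // Appending rows may force a reallocation, i.e. a new address plus a copy.
    m.push_back(cv::Mat(1000, 256, CV_32F));
    std::cout << "data after push_back:  " << (void*) m.data << std::endl;
    return 0;
}

Whether the pointer actually changes depends on the allocator and the reserved capacity, which is exactly why the loop above can print either one value or many.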
Upvotes: 1