Aka

Reputation: 147

Why is there always a 4.5 times difference between the RAM and hard disk size of a cv::Mat?

First of all, I use C++. I have a CV_32F cv::Mat, and when I write it to disk using FileStorage, the file becomes around 4.5 times larger than the Mat was in RAM during program execution. I have run several experiments and it is like that every time. So when I tried to read it back, my RAM (6 GB) was insufficient, even though it had been sufficient during the original program execution.

Here is how I write it down to the disk:

FileStorage fs( PATH, FileStorage::WRITE);
fs << "concatMat" << concatMat;
fs.release();

And this is how I calculate the occupied RAM size during the program execution:

size_t sz = sizeof(concatMat) + concatMat.total() * concatMat.elemSize(); // elemSize() is 4 bytes for CV_32F

I wonder about the reason behind this. In particular, why is the difference always a factor of about 4.5?

EDIT: I save them with a .bin extension, not as YAML or XML. I need to save them efficiently and am open to recommendations.

Upvotes: 2

Views: 129

Answers (1)

Sunreef

Reputation: 4542

Take a look at the contents of your XML, YML, or .bin file with Notepad++. (By the way, if you specify a path ending in .bin, OpenCV will still write it in YAML format...)

You will see that each float from your CV_32F Mat has been written in a format like 6.49999976e-001. That is 15 bytes of text instead of the 4 bytes a float occupies in memory, a ratio of 15 / 4 = 3.75. Add to that all the formatting characters such as ',', '\n', and ' ', and you can easily reach a size more than 4 times what you had in RAM.
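You can see this text/binary ratio for a single value with a small sketch (textLenOfFloat is a hypothetical helper, not an OpenCV function; the exact exponent width, e-01 vs e-001, depends on the C runtime):

```cpp
#include <cstdio>

// Length of the text that a "%.8e"-style formatting produces for one float,
// similar to the ~9 significant digits seen in the YAML file.
int textLenOfFloat(float v) {
    char buf[32];
    return std::snprintf(buf, sizeof(buf), "%.8e", static_cast<double>(v));
}
```

For example, textLenOfFloat(0.65f) yields 14 or 15 characters depending on the platform, versus sizeof(float) == 4, roughly the 3.75x ratio described above, before counting separators.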

If you save a Mat containing only zeros, you will see that the file size is quite close to the size in RAM, because each zero is written simply as "0.". It is actually smaller if you save it in XML format.

Upvotes: 2
