Reputation: 1321
I have 6 GB of free physical memory. I'm working with big data, around 4 GB in size. I have just realised that I can't save it the way I used to with smaller data (~1 GB):
void save(char f_name[], int size, float data[])
{
    std::fstream f_bin(f_name, std::ios::out|std::ios::binary);
    f_bin.seekg(std::ios::beg);
    f_bin.write((char*)data, size*sizeof(float));
    f_bin.close();
}
because it takes the data in memory and needs the same amount of RAM again to write it to the HDD. Is there a way to write the 4 GB in (e.g. 1 GB) chunks, so it never exceeds the 6 GB limit?
Upvotes: 1
Views: 1009
Reputation: 2071
How about something to the effect of:
#include <algorithm> // for std::min
#include <fstream>

void save(char f_name[], int size, float data[])
{
    std::fstream f_bin(f_name, std::ios::out | std::ios::binary);
    f_bin.seekp(0, std::ios::beg);
    while (size > 0)
    {
        // Write at most 1000 floats (4 KB) per pass, then advance the pointer.
        int amount = std::min(1000, size);
        f_bin.write((char*)data, amount * sizeof(float));
        data += amount;
        size -= amount;
    }
    f_bin.close();
}
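This writes 1000 floats (4 KB) at a time; the chunk size is arbitrary, and the loop never needs more memory than the array you already hold. A minimal usage sketch, assuming the 4 GB lives in a std::vector (the file name and element count here are illustrative, not from the question):

#include <vector>

int main()
{
    char name[] = "data.bin";              // save() takes a non-const char array
    std::vector<float> buffer(1000000000); // ~4 GB of floats, as in the question
    save(name, (int)buffer.size(), buffer.data());
}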
Upvotes: 1