Reputation: 7089
I have a simulation in which I need to dump a large amount of data, usually on the order of ~50 GB, while the code is running, which usually takes on the order of weeks. Right now, I am exporting the data as legacy ASCII VTK files.
I was wondering if there is any way to compress the data as it is being written to disk so I can save some space (I need to run multiple versions of the code at the same time). I would prefer something from the standard library if possible.
Upvotes: 2
Views: 525
Reputation: 24174
If you can use Boost, look at the zlib filter compressor and decompressor:
#include <fstream>
#include <iostream>
#include <boost/iostreams/filtering_streambuf.hpp>
#include <boost/iostreams/copy.hpp>
#include <boost/iostreams/filter/zlib.hpp>

int main()
{
    using namespace std;
    using namespace boost::iostreams;  // filtering_streambuf, input, zlib_decompressor
    // Open the compressed file in binary mode.
    ifstream file("hello.z", ios_base::in | ios_base::binary);
    filtering_streambuf<input> in;
    in.push(zlib_decompressor());      // decompress on the fly
    in.push(file);                     // source of the compressed bytes
    boost::iostreams::copy(in, cout);  // stream decompressed data to stdout
}
Upvotes: 3