Reputation: 1233
Good afternoon. I have the following situation: there are three sets of data, each set a two-dimensional table with about 50 million fields (~6000 rows and ~8000 columns). The data are stored in binary files. Language: C++.
I only need to display this data, but I got stuck trying to read it (I used std::vector, but the waiting time is too long). What is the best way to read/store such an amount of data (std::vector, plain pointers, special libraries)?
Maybe links to articles, books, or just personal experience?
Upvotes: 1
Views: 408
Reputation: 31290
There's no reason you shouldn't use plain old read and write on ifstream/ofstream. The following code doesn't take very long for a BigArray b( 6000, 8000 );
#include <fstream>
#include <iostream>
#include <string>
#include <stdlib.h>

class BigArray {
public:
    BigArray( int r, int c ) : rows(r), cols(c) {
        // One contiguous block; 6000 x 8000 ints is roughly 190 MB.
        data = (int*)malloc( rows * cols * sizeof(int) );
        if( NULL == data ){
            std::cout << "ERROR\n";
        }
    }
    virtual ~BigArray(){ free( data ); }

    // Fill with a repeating 0..n-1 pattern, just to have something to write.
    void fill( int n ){
        int v = 0;
        int * intptr = data;
        for( int irow = 0; irow < rows; irow++ ){
            for( int icol = 0; icol < cols; icol++ ){
                *intptr++ = v++;
                v %= n;
            }
        }
    }

    // Read the whole array back with a single bulk read().
    void readFromFile( std::string path ){
        std::ifstream inf( path.c_str(), std::ifstream::binary );
        inf.read( (char*)data, rows * cols * sizeof(*data) );
        inf.close();
    }

    // Write the whole array out with a single bulk write().
    void writeToFile( std::string path ){
        std::ofstream outf( path.c_str(), std::ofstream::binary );
        outf.write( (char*)data, rows * cols * sizeof(*data) );
        outf.close();
    }

private:
    int rows;
    int cols;
    int* data;
};
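For completeness, here is a minimal driver showing how the class above might be exercised; the file name "bigarray.bin" is just an example, not anything from the question:

int main(){
    BigArray b( 6000, 8000 );
    b.fill( 1000 );                     // populate with a dummy pattern
    b.writeToFile( "bigarray.bin" );    // one bulk write (~190 MB)
    b.readFromFile( "bigarray.bin" );   // one bulk read
    return 0;
}

The point is that a single read()/write() of the whole contiguous buffer avoids the per-element overhead that makes naive element-by-element I/O slow.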
Upvotes: 1
Reputation: 17455
Well, if you don't need all this data at once, you can use a memory-mapped file and read the data as if it were a giant array. Generally the operating system / file system cache works well enough for most applications, but certainly YMMV.
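As a rough sketch of what that looks like on a POSIX system (the file name, element type, and dimensions are assumed to match the question; on Windows you'd use MapViewOfFile or a portable wrapper such as boost::interprocess instead):

#include <sys/mman.h>
#include <fcntl.h>
#include <unistd.h>
#include <iostream>

int main(){
    const size_t rows = 6000, cols = 8000;
    const size_t bytes = rows * cols * sizeof(int);

    // "bigarray.bin" is an assumed file of raw ints in row-major order.
    int fd = open( "bigarray.bin", O_RDONLY );
    if( fd < 0 ){ std::cerr << "open failed\n"; return 1; }

    // Map the whole file read-only; pages are faulted in only when touched,
    // so "reading" the file costs almost nothing up front.
    void* p = mmap( NULL, bytes, PROT_READ, MAP_PRIVATE, fd, 0 );
    if( p == MAP_FAILED ){ std::cerr << "mmap failed\n"; close( fd ); return 1; }

    const int* data = static_cast<const int*>( p );
    // Access it like a giant array: element (r, c) is data[r * cols + c].
    std::cout << data[ 123 * cols + 456 ] << "\n";

    munmap( p, bytes );
    close( fd );
    return 0;
}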
Upvotes: 2