user2822838

Reputation: 347

Reading-writing large data to a custom binary file in C++

I have a bit of code in C++ that writes structs to a file. The format of the struct is:

    struct dataHeader
    {
        int headerID;
        int numberOfDataLines;
    };

    struct data
    {
        double id;
        double type;
        char name[100];
        char yyy[100];
    };

Now, these two structs are always written in pairs and a file contains upwards of 50000 of these pairs.

My question is: is there a way to do this more efficiently? File size is my major concern.

EDIT: The current code is a simple fwrite in a loop (pseudo-code):

    while (dataBlock.Next())
    {
        fwrite(&_dataHeader, sizeof(dataHeader), 1, fpbinary);

        while (dataLine.Next())
        {
            fwrite(&_data[i], sizeof(data), 1, fpbinary);
        }
    }
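One common way to shrink records like `data` is to stop paying for the two fixed 100-byte arrays when the strings are usually much shorter. The sketch below is a hypothetical variable-length encoding (not the asker's actual code): each string is written as a 1-byte length followed by only its real characters, so a record with short names drops from 216 bytes to a few dozen. It assumes every string is under 256 characters.

```cpp
#include <cstdint>
#include <cstdio>
#include <cstring>

// Hypothetical helper: write a string as <1-byte length><characters>
// instead of a fixed char[100] array. Assumes strlen(s) < 256.
static void writeString(FILE* fp, const char* s)
{
    std::uint8_t len = static_cast<std::uint8_t>(std::strlen(s));
    std::fwrite(&len, sizeof(len), 1, fp);
    std::fwrite(s, 1, len, fp);
}

// Hypothetical record writer: fixed-size numeric fields followed by
// two length-prefixed strings. A record like (id, type, "Bob", "x")
// takes 8 + 8 + 1 + 3 + 1 + 1 = 22 bytes instead of 216.
static void writeRecord(FILE* fp, double id, double type,
                        const char* name, const char* yyy)
{
    std::fwrite(&id, sizeof(id), 1, fp);
    std::fwrite(&type, sizeof(type), 1, fp);
    writeString(fp, name);
    writeString(fp, yyy);
}
```

If `id` and `type` are really whole numbers, storing them as `int32_t` rather than `double` would halve those fields as well. The trade-off is that records are no longer fixed-size, so random access requires an index or a full scan.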

Thanks.

Upvotes: 1

Views: 653

Answers (2)

Semih Ozmen

Reputation: 591

You may reduce your storage requirements by grouping records that share values. For example, you could build a list of the distinct "name" or "yyy" values and write the records in groups, e.g. first all the records named "Bob", then all those named "Josh", so that each string is stored only once.
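The grouping idea above can be sketched as a string table: each distinct name is stored once, and every record refers to it by a 4-byte index. The class below is a minimal illustration (the `StringTable` name and API are made up for this answer), so 50000 records sharing a few hundred names pay for each 100-character string only once.

```cpp
#include <cstdint>
#include <string>
#include <unordered_map>
#include <vector>

// Hypothetical string table: intern() returns a stable index for each
// distinct string; records store the index instead of the characters.
class StringTable
{
public:
    std::uint32_t intern(const std::string& s)
    {
        auto it = index_.find(s);
        if (it != index_.end())
            return it->second;                       // seen before: reuse id
        std::uint32_t id = static_cast<std::uint32_t>(strings_.size());
        strings_.push_back(s);                       // store the string once
        index_.emplace(s, id);
        return id;
    }

    const std::string& lookup(std::uint32_t id) const { return strings_[id]; }
    std::size_t size() const { return strings_.size(); }

private:
    std::vector<std::string> strings_;
    std::unordered_map<std::string, std::uint32_t> index_;
};
```

On write, you would first dump the table (count, then each string), then the records with their indices; on read, rebuild the table first and translate indices back with lookup().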

If all of your records are unique, the remaining option is to compress the binary data before writing it to the file and decompress it after reading. I suggest QuickLZ, which is quite fast at both compression and decompression.

Upvotes: 2

Jose Palma

Reputation: 756

You can try compressing the file contents if your timing requirements are not too strict.

How can I easily compress and decompress files using zlib?

Upvotes: 1
