Abhishek Thakur

Reputation: 17035

reading big data in C++

I'm using C++ to read large files with over 30,000 lines and 3,000 columns, i.e. a (30000 x 3000) matrix. I push the parsed data into a 2D vector, but I need to repeat this process several times. Is there any way to optimize the reading process?
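Roughly, the reading code looks like this (a minimal sketch assuming whitespace-separated numeric values; the element type `double` is a placeholder):

```cpp
#include <fstream>
#include <sstream>
#include <string>
#include <utility>
#include <vector>

// Read a whitespace-separated numeric file into a 2D vector.
std::vector<std::vector<double>> readMatrix(const std::string& path) {
    std::vector<std::vector<double>> matrix;
    matrix.reserve(30000);  // known row count: avoid repeated reallocation
    std::ifstream in(path);
    std::string line;
    while (std::getline(in, line)) {
        std::istringstream ss(line);
        std::vector<double> row;
        row.reserve(3000);  // known column count
        double value;
        while (ss >> value) {
            row.push_back(value);
        }
        matrix.push_back(std::move(row));
    }
    return matrix;
}
```

Even keeping this strategy, reserving capacity up front removes one common source of slowdown from repeated vector reallocation.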

Upvotes: 4

Views: 941

Answers (2)

Nayana Adassuriya

Reputation: 24766

I will give you some ideas rather than an exact solution, because I do not know the full details of your system.

  1. If the file is this big and only some of the data changes between readings, consider moving the data into a database instead of re-parsing the file every time.
  2. For raw reading performance, you can read the file concurrently (read the same file part by part using multiple threads; a sketch follows this list).
  3. If you need to process the data as well, use separate thread(s) for processing, linked to the reader(s) by a queue or parallel queues.
  4. If your records have a fixed length (such as fixed-width numbers) and you know which locations changed, read only the changed data instead of reading and processing the whole file again and again (see the seek sketch after this list).
  5. If none of the above helps, use memory mapping. If you are looking for portability, Boost's memory-mapped files will reduce your work (see the memory-mapping sketch after this list).
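A minimal sketch of point 2, assuming the file can be split by byte ranges (the file name `matrix.txt` is hypothetical, and real code would have to stitch lines that straddle chunk boundaries):

```cpp
#include <fstream>
#include <functional>
#include <string>
#include <thread>
#include <vector>

// Each thread opens its own stream and reads one byte range of the file.
void readChunk(const std::string& path, std::streamoff begin,
               std::streamoff end, std::vector<char>& out) {
    std::ifstream in(path, std::ios::binary);
    in.seekg(begin);
    out.resize(static_cast<std::size_t>(end - begin));
    in.read(out.data(), end - begin);
}

int main() {
    const std::string path = "matrix.txt";  // hypothetical file name
    std::ifstream probe(path, std::ios::binary | std::ios::ate);
    const std::streamoff size = probe.tellg();  // total file size

    const int n = 4;  // number of reader threads
    std::vector<std::vector<char>> chunks(n);
    std::vector<std::thread> threads;
    for (int i = 0; i < n; ++i) {
        std::streamoff b = size * i / n;
        std::streamoff e = size * (i + 1) / n;
        threads.emplace_back(readChunk, path, b, e, std::ref(chunks[i]));
    }
    for (auto& t : threads) t.join();  // chunks now hold the whole file
}
```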
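A minimal sketch of point 4, assuming fixed-width binary records of doubles (the record layout, row indices, and file name are hypothetical):

```cpp
#include <cstddef>
#include <fstream>
#include <vector>

// Re-read a single row from a binary file of fixed-width records,
// where each row is `cols` doubles stored back to back.
std::vector<double> readRow(std::ifstream& in, std::size_t row, std::size_t cols) {
    std::vector<double> values(cols);
    const std::streamoff rowBytes =
        static_cast<std::streamoff>(cols * sizeof(double));
    in.seekg(static_cast<std::streamoff>(row) * rowBytes, std::ios::beg);
    in.read(reinterpret_cast<char*>(values.data()), rowBytes);
    return values;
}

// Usage: re-read only the rows that changed, not the whole file.
// std::ifstream in("matrix.bin", std::ios::binary);
// auto r10 = readRow(in, 10, 3000);
// auto r42 = readRow(in, 42, 3000);
```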
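And a minimal sketch of point 5 with Boost.Iostreams' `mapped_file_source` (assuming Boost is installed; the file name is a placeholder). The whole file appears as one contiguous read-only byte range, so repeated passes avoid per-read stream overhead:

```cpp
#include <boost/iostreams/device/mapped_file.hpp>
#include <cstddef>
#include <cstdio>

int main() {
    // Map the file read-only; the OS pages it in on demand.
    boost::iostreams::mapped_file_source file("matrix.txt");
    const char* data = file.data();        // first byte of the mapping
    const std::size_t size = file.size();  // total size in bytes

    // Example pass: count lines with no stream or copy overhead.
    std::size_t lines = 0;
    for (std::size_t i = 0; i < size; ++i) {
        if (data[i] == '\n') ++lines;
    }
    std::printf("%zu lines\n", lines);
}
```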

Upvotes: 2

mumu

Reputation: 317

Memory mapping is a good fit here, since there are only read operations.
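For example, on POSIX systems the mapping can be opened with `PROT_READ`, which matches that read-only access pattern (a sketch; the file name is hypothetical):

```cpp
#include <cstdio>
#include <fcntl.h>
#include <sys/mman.h>
#include <sys/stat.h>
#include <unistd.h>

int main() {
    int fd = open("matrix.txt", O_RDONLY);  // hypothetical file name
    if (fd < 0) return 1;

    struct stat st;
    if (fstat(fd, &st) != 0) { close(fd); return 1; }

    // PROT_READ: a read-only mapping is enough, since we never write.
    void* addr = mmap(nullptr, st.st_size, PROT_READ, MAP_PRIVATE, fd, 0);
    if (addr == MAP_FAILED) { close(fd); return 1; }
    const char* data = static_cast<const char*>(addr);

    // ... parse data[0 .. st.st_size) as needed ...
    std::printf("mapped %lld bytes\n", static_cast<long long>(st.st_size));

    munmap(addr, st.st_size);
    close(fd);
}
```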

Upvotes: 2
