Hakkim Ansari

Reputation: 119

What is an efficient way to map a file to an array of structures?

I have a file in my system with 1024 rows,

student_db.txt

    Name   Subject-1  Subject-2  Subject-3
    -----  ---------  ---------  ---------
    Alex     98         90         80
    Bob      87         95         73
    Mark     90         83         92
    ....     ..         ..         ..
    ....     ..         ..         ..

I have an array of structures in my C code,

typedef struct
{
  char name[10];
  int  sub1;
  int  sub2;
  int  sub3;
} student_db;

student_db  stud_db[1024];

What is an efficient way to read this file and map it to this array of structures?

If the number of entries were small, I would just use the usual fgets in a while loop with strtok, but here there are 1024 entries.

So please suggest an efficient way to do this task.

Upvotes: 1

Views: 111

Answers (2)

DawidPi

Reputation: 2365

You can try to store the data in binary form. The way the file is written now is as text, so everything (numbers, strings and the rest) is represented as characters.

If you store things in binary, numbers, strings and other values are stored exactly as they are in memory. When you write an integer to the file you write its raw bytes; if you later open the file in a text editor you will see whatever characters those byte values map to in, e.g., ASCII, not the number itself.

You use the standard functions to store things in binary:

FILE *someFile = fopen("test.bin", "wb");

w stands for write and b stands for binary format. The function to write is:

fwrite(&someInt, sizeof(someInt), 1, someFile);

where someInt is the variable you want to write (the function takes a pointer), sizeof(someInt) is the size of one element, 1 is the number of elements (more if the first argument is an array), and someFile is the FILE * you want to store your data in.


This way the size of the file can be reduced, so loading it will also be faster. It is also simpler to process the data in the file.

Upvotes: 0

ern0

Reputation: 3172

  1. Check the size of the file; I think it is at most about 100 KByte. That is literally nothing: even a poorly written PHP script can read it in a few milliseconds. There is no slow way to load such a small amount of data.

  2. I assume loading this file is only the first step; the real task will be processing the list (search, filter, etc.). Instead of optimizing the loading speed, you should focus on the processing speed.

  3. Premature optimization is evil. Write working unoptimized code first and see whether you are satisfied with the result and the speed. You probably will never need to optimize it.

Upvotes: 5
