Reputation: 148
I am trying to use fread and fwrite to read and write data pertaining to a structure to and from a file. Here's my code:
#include <stdio.h>
#include <time.h>
#include <stdlib.h>
#include <string.h>

typedef struct book book;
struct book
{
    char title[200];
    char auth[200];
    char publi[200];
    int p_year;
    int price;
    int edition;
    int isbn;
};

int main()
{
    int i;
    FILE* fp = fopen("this.dat","w");
    book * a = calloc(1000000,sizeof (book));
    srand(time(NULL));
    for(i=0;i<1000000;i++)
    {
        a[i].price = rand()%1000;
        a[i].p_year = 1500 + rand()%518;
        a[i].isbn = 10000+rand()%100000;
        a[i].edition = i%15;
        strcpy(a[i].title,"title");
        strcpy(a[i].auth,"author");
        strcpy(a[i].publi,"publication");
    }
    if((i=fwrite(a,sizeof(*a),1000000,fp))!=1000000)
    {
        printf("ERROR - Only %d records written\n",i);
        printf("feof:%d\nferror:%d",feof(fp),ferror(fp));
        return EXIT_FAILURE;
    }
    if(ferror(fp))
    {
        printf("ERROR");
        return EXIT_FAILURE;
    }
    if(fclose(fp)!=0)
    {
        printf("ERROR while closing the stream");
        return EXIT_FAILURE;
    }
    if((fp = fopen("this.dat","r")) == NULL)
    {
        printf("ERROR reopening");
        return EXIT_FAILURE;
    }
    if((i=fread(a,sizeof(book),100,fp))!=100)
    {
        printf("ERROR - Only %d records read\n",i);
        printf("feof:%d\nferror:%d",feof(fp),ferror(fp));
        return EXIT_FAILURE;
    }
    if(ferror(fp))
    {
        printf("~ERROR");
        return EXIT_FAILURE;
    }
    for(i=0;i<100;i++)
        printf("price:%d\nedition:%d\nisbn:%d\np_year:%d\n\n\n",a[i].price,a[i].edition,a[i].isbn,a[i].p_year);
    fclose(fp);
    return EXIT_SUCCESS;
}
The thing is, it occasionally executes successfully, but most of the time it doesn't: I get an error while reading back from the file using fread. It ends up reading a variable number of records every time, and fewer records than it's supposed to (i.e. 100). The following is the output of one unsuccessful run of the program:
ERROR - Only 25 records read
feof:16
ferror:0
Question 1: Why is EOF reached after reading just 25 records, when more than 25 were written? (I've tried using rewind/fseek after reopening the file, but the issue still persisted.)
Question 2: In such cases, is it normal for the data contained in the array a beyond a[x-1] to get corrupted when only x (< 100) records are read? Would the data beyond a[99] still have been corrupted even if all 100 records had been read successfully? (I know the data gets corrupted, because printing the fields of elements of a beyond the x-th element yields inappropriate values, like price > 1000 or price < 0, and so on.)
Upvotes: 1
Views: 1213
Reputation: 140168
You shouldn't open your files in text mode when reading/writing binary structures.
While it has no effect on Linux/Unix, on Windows this has serious consequences, and it makes your files non-portable between Windows and Linux.
Depending on the data, the LF <=> CR/LF conversion can corrupt or shift the data (removing a carriage return or inserting one).
In text mode on Windows, each LF byte (ASCII 10) is replaced by CR+LF (ASCII 13+10) when writing, and the reverse happens when reading (13+10 => 10). Such 10 bytes can occur, for instance, when writing the year 1802 (hex: 0x70A) as binary.
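To see where such a byte comes from, here is a minimal standalone sketch (not part of the original code) that dumps the bytes of the int value 1802:

#include <stdio.h>

int main(void)
{
    int year = 1802;                         /* hex 0x0000070A */
    unsigned char *p = (unsigned char *)&year;
    size_t i;

    /* one of these bytes is 0x0A (LF); a Windows text-mode stream
       rewrites it as CR+LF on output, shifting everything after it */
    for (i = 0; i < sizeof year; i++)
        printf("byte %zu: 0x%02X%s\n", i, p[i],
               p[i] == 0x0A ? "  <- LF" : "");
    return 0;
}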
Solution: use binary mode:
if((fp = fopen("this.dat","rb")) == NULL)
and
FILE* fp = fopen("this.dat","wb");
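For reference, here is a trimmed, self-contained sketch of the fixed round trip (it assumes the same struct layout as in the question; the file name demo.dat is just an example):

#include <stdio.h>
#include <stdlib.h>

typedef struct { char title[200], auth[200], publi[200];
                 int p_year, price, edition, isbn; } book;

int main(void)
{
    book out = { .title = "title", .auth = "author", .publi = "publication",
                 .p_year = 1802, .price = 10, .edition = 1, .isbn = 12345 };
    book in;
    FILE *fp = fopen("demo.dat", "wb");                /* binary write mode */

    if (fp == NULL || fwrite(&out, sizeof out, 1, fp) != 1 || fclose(fp) != 0)
        return EXIT_FAILURE;
    if ((fp = fopen("demo.dat", "rb")) == NULL)        /* binary read mode */
        return EXIT_FAILURE;
    if (fread(&in, sizeof in, 1, fp) != 1)
        return EXIT_FAILURE;
    fclose(fp);

    printf("p_year read back: %d\n", in.p_year);       /* 1802 survives intact */
    return EXIT_SUCCESS;
}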
Note: In "text" mode, specifying a block size doesn't work since the size depends on the data. That probably answers your second question: last 100th record read is corrupt because you're reading too few bytes. I'm not sure about the details but since the system adds/removes bytes when writing/reading, block size can be buggy.
Upvotes: 1