Jordan Davis

Reputation: 1520

Using fgets() - the most memory-efficient way

Looking around online, I have yet to find a definitive answer about how large the buffer passed to fgets() (the char *restrict s parameter) should be for efficient memory allocation.

synopsis: char *fgets(char *restrict s, int n, FILE *restrict stream);

Looking at the fgets() specification:

From my understanding of this spec, you should allocate based on the LINE_MAX macro defined in <limits.h>, simply because you don't know how many characters each line has.

If I run printf("LINE_MAX BYTES: %d\n", LINE_MAX); the result is 2048.

That being said, declaring char line[LINE_MAX] -or- char line[2048] seems inefficiently large to me; however, this may be the best way of doing it?

//PROGRAM

#include <stdio.h>
#include <limits.h>

int main(void){
    char line[LINE_MAX];
    FILE *fp = fopen("file.txt", "r");
    while(fgets(line, sizeof line, fp)){ /* fgets's size argument includes the '\0' */
        printf("%s", line);
    }
    fclose(fp);
    return 0;
}

//FILE (file.txt)

Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. 
Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua.
Lorem ipsum dolor sit amet, consectetur adipiscing elit, sed do eiusmod tempor incididunt ut labore et dolore magna aliqua. 

Upvotes: 1

Views: 1351

Answers (2)

Sergey Kalinichenko

Reputation: 726479

Unless your file has some special properties, allocating LINE_MAX bytes is the best approach. Although you may waste some memory there, allocation in automatic (stack) memory is really cheap, because the space is being "reserved" more than "allocated", with lots of hardware support on most modern architectures.

On the other hand, if you know that due to the format of your file the lines cannot exceed a certain length MY_LINE, you could use MY_LINE+2* as your limit instead:

char line[MY_LINE+2];

For example, a program reading a file in uuencode format needs at most 62 characters per line, so it could define

#define MY_LINE 62

* Assuming you're on UNIX, you need space for '\n' and '\0', hence the +2 part. If you are on Windows and reading in binary mode, use +3 to accommodate the additional '\r' character.

Upvotes: 4

user3629249

Reputation: 16540

The following code:

cleanly compiles
performs error checking

If a line is too long, the partial line will be echoed, then the next call to fgets() will obtain some more of the line, which will be echoed, etc. No output will be lost and no undefined behaviour will occur.

You could use getline() instead of fgets(), but why bother? Using getline() makes the code more complex without adding any functionality, and (as others have commented) a 2048-byte buffer is trivial on today's computers.

If you're really worried about the buffer size, the following could use a buffer as small as 2 bytes for Linux/Mac (3 bytes for Windows/DOS) and still work correctly:

#include <stdio.h>
#include <stdlib.h>

#define LINE_MAX (2048)

int main(void)
{
    char line[LINE_MAX];

    FILE *fp = NULL;
    if( NULL == ( fp = fopen("file.txt", "r") ) )
    {
        perror( "fopen for file.txt for read failed");
        exit( EXIT_FAILURE );
    }

    while( fgets(line, LINE_MAX, fp) )
    {
        printf("%s", line);
    }

    fclose(fp);
    return 0;
}

Upvotes: 0
