Tom Tetlaw

Reputation: 113

Reading files in 64kb blocks

I want to create a function that copies a file to some location. I'm wondering whether it would be beneficial to read it in 64 KB blocks, or should I just dynamically allocate the buffer? Or should I just use the system() function to do it on the command line?

I mean like this:

int copy_file(const char *source, const char *dest)
{
    FILE *fsource, *fdest;
    size_t readSize;
    unsigned char buffer[64*1024]; //64 KB block

    fsource = fopen(source, "rb");
    if(!fsource)
        return 0;
    fdest = fopen(dest, "wb"); //open dest only after source succeeded
    if(!fdest)
    {
        fclose(fsource);
        return 0;
    }

    while(1)
    {
        readSize = fread(buffer, 1, sizeof(buffer), fsource);
        if(!readSize)
            break;
        if(fwrite(buffer, 1, readSize, fdest) != readSize)
        {
            //short write: bail out and report failure
            fclose(fsource);
            fclose(fdest);
            return 0;
        }
    }

    fclose(fsource);
    //fclose flushes buffered output, so its result matters too
    return fclose(fdest) == 0;
}

Upvotes: 2

Views: 452

Answers (1)

matzahboy

Reputation: 3024

The optimal read size is going to be very platform dependent. A power of 2 is definitely a good idea, but without testing, it would be hard to say which size would be best.

If you want to see how cp copies files, you can look at the bleeding-edge source code.

Upvotes: 3
