Neovea

Reputation: 514

Copying a file from a URL to my own server: the file remains at 0 MB

I'm facing a problem and I'm not really sure this is the right way of doing it. I need to copy a file from a remote server to my server with PHP. I use the following script:

public function download($file_source, $file_target) {
    $rh = fopen($file_source, 'rb');
    $wh = fopen($file_target, 'w+b');
    if (!$rh || !$wh) {
        // Close whichever handle did open before bailing out,
        // so we don't leak a file descriptor.
        if ($rh) fclose($rh);
        if ($wh) fclose($wh);
        return false;
    }

    while (!feof($rh)) {
        // Read in fixed 4096-byte chunks so memory use stays constant.
        if (fwrite($wh, fread($rh, 4096)) === false) {
            fclose($rh);
            fclose($wh);
            return false;
        }
        echo ' ';
        flush();
    }

    fclose($rh);
    fclose($wh);

    return true;
}

but in the end, the file size remains at 0.

EDIT: I've updated my question because there are still some things I don't understand. For fread, I first used 2048 bytes, but it didn't work. The script above, which I found, uses 4096 bytes.

My question: how do I determine what chunk size (amount of memory?) to use so that the file gets downloaded every time? This one works on a specific (dedicated) machine, but will it work on a shared host where I cannot modify php.ini?
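To illustrate the question: the second argument to fread is a buffer size in bytes, not megabytes, and only one chunk is ever held in memory at a time, so any modest size (2048, 4096, 8192) stays far below a shared host's memory_limit. A minimal sketch (the copy_in_chunks helper and the 1 MB test file are hypothetical, for demonstration only):

```php
<?php
// Sketch: copy a file in fixed-size chunks. Memory use is bounded by the
// chunk size, not by the size of the file being copied.
function copy_in_chunks(string $source, string $target, int $chunk = 4096): bool {
    $rh = fopen($source, 'rb');
    $wh = fopen($target, 'wb');
    if (!$rh || !$wh) {
        if ($rh) fclose($rh);
        if ($wh) fclose($wh);
        return false;
    }
    while (!feof($rh)) {
        $data = fread($rh, $chunk);
        if ($data === false || fwrite($wh, $data) === false) {
            fclose($rh);
            fclose($wh);
            return false;
        }
    }
    fclose($rh);
    fclose($wh);
    return true;
}

// Hypothetical demo with a 1 MB local file.
$src = tempnam(sys_get_temp_dir(), 'src');
$dst = tempnam(sys_get_temp_dir(), 'dst');
file_put_contents($src, str_repeat('x', 1024 * 1024));

$before = memory_get_usage();
copy_in_chunks($src, $dst, 4096);
$grown = memory_get_usage() - $before;

clearstatcache();
var_dump(filesize($dst) === 1024 * 1024); // bool(true)
var_dump($grown < 1024 * 1024);           // bool(true): far less than the file size
```

So the chunk size mostly trades the number of read/write calls against buffer size; it is not what makes a download succeed or fail on a shared host.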

Thanks again

Upvotes: 0

Views: 370

Answers (1)

Marc B

Reputation: 360662

filesize() expects a filename/path. You're passing in a file handle, which means filesize() will FAIL and return a boolean false.

You then use that false as the size argument for your fread, where it gets cast to the integer 0. So essentially you're sitting there telling PHP to read the file 0 bytes at a time.

You cannot reliably get the size of a remote file anyway, so just have fread read some fixed number of bytes, e.g. 2048, at a time:

while (!feof($handle)) {
    $contents = fread($handle, 2048);
    fwrite($f, $contents);
}

and if that file isn't too big and/or your PHP can handle it:

file_put_contents('local.mp4', file_get_contents('http://whatever/foo.mp4'));
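A middle ground between the two (a sketch, not part of the original answer): stream_copy_to_stream() copies between two open handles in internal chunks, so memory stays flat even for large files, without writing the loop yourself. Reading http:// sources this way requires allow_url_fopen to be enabled; the download_stream wrapper below is a hypothetical name.

```php
<?php
// Sketch: copy between two streams without loading the file into memory.
// Returns the number of bytes copied, or false on failure.
function download_stream(string $source, string $target) {
    $in  = fopen($source, 'rb');
    $out = fopen($target, 'wb');
    if (!$in || !$out) {
        if ($in) fclose($in);
        if ($out) fclose($out);
        return false;
    }
    $bytes = stream_copy_to_stream($in, $out);
    fclose($in);
    fclose($out);
    return $bytes;
}
```

Usage would look like download_stream('http://whatever/foo.mp4', 'local.mp4'), and unlike the file_get_contents one-liner it does not need to fit the whole file in memory.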

Upvotes: 1
