Reputation: 3713
I need to periodically back up an image file (as is) from my public server onto my local dev server.
What's the best way to go about it? A simple combination of file_get_contents
and file_put_contents
? Or is there some other way to do this?
And if it's the file_get_contents()
way, how does one get the contents of an image file properly/exactly so it can simply be written via file_put_contents? I don't think a simple file read is enough; my current tests are writing blank, zero-byte JPGs.
Regards
Upvotes: 0
Views: 1221
Reputation: 97
It is not recommended to use file_get_contents
to copy big files, as it loads the entire file into memory and could cause a crash if memory is short.
A better solution is to read it in chunks:
function getUrlFile($url, $path)
{
    $newfile = false;
    $file = fopen($url, "rb");
    if ($file) {
        $newfile = fopen($path, "wb");
        if ($newfile) {
            // Copy in 8 KB chunks so the whole file never sits in memory
            while (!feof($file)) {
                fwrite($newfile, fread($file, 1024 * 8));
            }
        }
        fclose($file);
    }
    if ($newfile) {
        fclose($newfile);
    }
}
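The same chunked copy can also be done with PHP's built-in stream_copy_to_stream(); a minimal sketch, assuming the same placeholder URL and destination path used elsewhere in this thread:
$src = fopen('http://hostname/images/image.png', 'rb');
$dst = $src ? fopen('/var/www/html/image.png', 'wb') : false;
if ($src && $dst) {
    stream_copy_to_stream($src, $dst); // streams in chunks, never loads the whole file into memory
}
if ($src) { fclose($src); }
if ($dst) { fclose($dst); }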
Upvotes: 1
Reputation: 11375
There are multiple ways you can do this (if we don't restrict ourselves to PHP).
A script can run an rsync
or scp
command to copy the file over via SSH:
scp -i ~/.ssh/id_rsa /var/www/html/image.png foo@local-dev-machine:/var/www/html
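If rsync is preferred (it can skip the transfer when the file is unchanged), an equivalent command, assuming the same host and paths as the scp example above, would be:
rsync -az -e "ssh -i ~/.ssh/id_rsa" /var/www/html/image.png foo@local-dev-machine:/var/www/html/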
PHP's fopen wrappers can also fetch the file directly; as the PHP manual puts it: "A URL can be used as a filename with this function if the fopen wrappers have been enabled. See fopen() for more details [...]"
$url = 'http://hostname/images/image.png';
file_put_contents('/var/www/html/image.png', file_get_contents($url));
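Since the original tests produced zero-byte files, it is worth checking the return value before writing; a minimal sketch:
$data = file_get_contents($url);
if ($data !== false) { // false means the download failed
    file_put_contents('/var/www/html/image.png', $data);
}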
Or fetch it with cURL:
$saveto = '/var/www/html/image.png';
$ch = curl_init($url);
curl_setopt($ch, CURLOPT_HEADER, 0);         // body only, no response headers
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1); // return the body instead of printing it
$raw = curl_exec($ch);
curl_close($ch);
if (file_exists($saveto)) {
    unlink($saveto);
}
$fp = fopen($saveto, 'x'); // 'x' creates the file and fails if it already exists
fwrite($fp, $raw);
fclose($fp);
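Note that curl_exec() returns false on failure when CURLOPT_RETURNTRANSFER is set, so a real script should check $raw before writing it to disk.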
Upvotes: 1
Reputation: 5094
As you already know, use file_get_contents() and file_put_contents():
$url = 'http://hostname/images/wml.gif';
file_put_contents('/your_path/image.gif', file_get_contents($url));
Upvotes: 2