Reputation: 33
I have a database table containing picture URLs, which looks like this:
DATABASE :
id / url
1 / http://www.pic1.jpg
2 / http://www.pic2.jpg
and so on (+1000)
I want to run a script that downloads all of them to my local HDD, storing each one as C:/wamp/www/currentproject/pictures/pic_id.jpg.
I was wondering what the simplest way to do this would be. My idea was to run a PHP script that fetches the URLs from MySQL and then downloads each one with some PHP command, something like this (pseudocode):
while ($url = $response->fetch()) {
    php_download($url, $localfolder, $name); // pseudocode
}
But I was not able to find such a function.
I have also heard about wget, but I am not sure how to combine it with MySQL.
Upvotes: 2
Views: 1305
Reputation: 1150
set_time_limit(1000); // raise the execution time limit: 1000+ downloads take a while
while ($row = $response->fetch()) {
    // fetch the image and save it under its database id, e.g. pic_1.jpg
    $my_image = file_get_contents($row['url']);
    $my_file = fopen('C:/wamp/www/currentproject/pictures/pic_' . $row['id'] . '.jpg', 'w+');
    fwrite($my_file, $my_image);
    fclose($my_file);
}
I did not test this code. (Note: you can convert it into a function.)
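Following the note above, here is a hedged sketch of the same loop body wrapped in a reusable function. The function name download_picture, its signature, and the directory argument are my own invention for illustration, not part of the original answer:

```php
<?php
// Hypothetical helper: fetch one image from $url and store it as pic_<id>.jpg in $dir.
// Returns true on success, false if the download or the write failed.
function download_picture(string $url, int $id, string $dir): bool
{
    $data = @file_get_contents($url); // also accepts local paths, handy for testing
    if ($data === false) {
        return false;
    }
    return file_put_contents($dir . '/pic_' . $id . '.jpg', $data) !== false;
}
```

It would then be called from the fetch loop, e.g. `download_picture($row['url'], $row['id'], 'C:/wamp/www/currentproject/pictures');`, and the boolean return value lets you log which of the 1000+ downloads failed instead of aborting the whole run.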
Upvotes: 3