bruine

Reputation: 647

How to download files from links?

I search for links on a website with this code:

<?php

$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, "http://example.com");
curl_setopt($ch, CURLOPT_TIMEOUT, 0);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
$result = curl_exec($ch);
curl_close($ch);

// search the results from the starting site
if ($result) {
    preg_match_all('/<a href="(http:\/\/www.[^0-9]+.pdf?)"/', $result, $output, PREG_SET_ORDER);
    foreach ($output as $item) {
        print_r($item);
    }
}
copy($item, 'file.pdf');
?>

Only one PDF link is read. I then need code to download the PDF files provided by those links in PHP. The copy() function doesn't work. Thank you :)

Upvotes: 0

Views: 463

Answers (2)

bruine

Reputation: 647

I've solved it using this code, thanks to @Oldskool :) :

<?php
set_time_limit(0);
include 'simple_html_dom.php';
// file_get_html() needs a full URL, not a bare hostname
$url = 'http://example.com';
// set your save path here
$path = '/home/igos/pdfs/';

$html = file_get_html($url) or die('invalid url');
foreach ($html->find('a') as $e) {
    $link = $e->href;
    if (preg_match('/\.pdf$/i', $link)) {
        $result[] = $link;
        copy($link, $path . basename($link));
    }
}
?>
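For comparison, the same crawl can be done without the third-party simple_html_dom library, using PHP's built-in DOMDocument extension. This is only a sketch; the URL and save path in the commented usage are placeholders:

```php
<?php
// Collect the href of every <a> tag whose target ends in .pdf,
// using only PHP's built-in DOM extension.
function find_pdf_links($html) {
    $doc = new DOMDocument();
    libxml_use_internal_errors(true);   // silence warnings on messy real-world HTML
    $doc->loadHTML($html);
    libxml_clear_errors();

    $links = array();
    foreach ($doc->getElementsByTagName('a') as $a) {
        $href = $a->getAttribute('href');
        if (preg_match('/\.pdf$/i', $href)) {
            $links[] = $href;
        }
    }
    return $links;
}

// Placeholder usage -- substitute your own URL and save path:
// $html = file_get_contents('http://example.com');
// foreach (find_pdf_links($html) as $link) {
//     copy($link, '/home/igos/pdfs/' . basename($link));
// }
```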

Upvotes: 1

Oldskool

Reputation: 34837

There are two problems here:

  1. You are only printing inside your foreach loop, not saving anything.
  2. You are using the copy() function with the static filename file.pdf.

You will probably want to do the saving inside your foreach loop, giving each file either its original name or a unique one (otherwise each save operation would overwrite the previous file.pdf), something like this:

// Set your save path here
$path = '/home/igos/pdfs/';

foreach ($output as $item) {
    // With PREG_SET_ORDER, $item[1] holds the captured URL
    copy($item[1], $path . basename($item[1]));
}

That would save all the files, keeping their original filenames, in the /home/igos/pdfs/ folder.
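Putting both fixes together with the original cURL fetch, a complete sketch could look like the following. Note the regex here is my own looser assumption (any quoted href ending in .pdf) rather than the question's original pattern, and the URL and path in the commented usage are placeholders:

```php
<?php
// Extract every PDF URL from a page's HTML.
// With PREG_SET_ORDER, each $item is an array: $item[0] is the
// whole <a href="..."> match, $item[1] the captured URL.
function extract_pdf_urls($html) {
    preg_match_all('/<a href="([^"]+\.pdf)"/i', $html, $output, PREG_SET_ORDER);
    $urls = array();
    foreach ($output as $item) {
        $urls[] = $item[1];
    }
    return $urls;
}

// Placeholder usage with the original cURL fetch:
// $ch = curl_init();
// curl_setopt($ch, CURLOPT_URL, 'http://example.com');
// curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
// $result = curl_exec($ch);
// curl_close($ch);
//
// $path = '/home/igos/pdfs/';  // set your save path here
// foreach (extract_pdf_urls($result) as $url) {
//     copy($url, $path . basename($url));  // keep the original filename
// }
```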

Upvotes: 1
