Reputation: 21627
Using PHP, I'm trying to crawl a web page and then automatically grab an image from it.
I've tried the following:
<?php
$url = "http://www.domain.co.uk/news/local-news";
$str = file_get_contents($url);
?>
and
<?php
$opts = array('http'=>array('header' => "User-Agent:Mozilla/5.0 (Windows NT 6.2) AppleWebKit/537.1 (KHTML, like Gecko) Chrome/21.0.1180.75 Safari/537.1\r\n"));
$context = stream_context_create($opts);
$header = file_get_contents('http://www.domain.co.uk/news/local-news', false, $context);
?>
and also
<?php
include('simple_html_dom.php');
$html = file_get_html('http://www.domain.co.uk/news/local-news');
$result = $html->find('section article img', 0)->outertext;
?>
but these all return an Internal Server Error. I can view the site perfectly in the browser, but when I try to fetch the page in PHP it fails.
Is there anything I can try?
Upvotes: 2
Views: 3660
Reputation: 6379
Sometimes you might get an error opening an HTTP URL with file_get_contents, even though you have set allow_url_fopen = On in php.ini.
For me the solution was to also set "user_agent" to something.
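For example, a minimal sketch that sets the user agent with ini_set() before calling file_get_contents (the URL is the placeholder from the question, and the user-agent string is just an example):
<?php
// Assumption: allow_url_fopen is already On; the only missing piece is a user agent.
ini_set('user_agent', 'Mozilla/5.0 (Windows NT 6.2) AppleWebKit/537.1 (KHTML, like Gecko) Chrome/21.0.1180.75 Safari/537.1');

$str = file_get_contents('http://www.domain.co.uk/news/local-news');
if ($str === false) {
    die('Request failed');
}
?>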
Upvotes: 1
Reputation: 1464
Try the code below; it will save the content to a local file.
<?php
// Download the page with cURL and write the response body to localfile.html.
$ch = curl_init("http://www.domain.co.uk/news/local-news");
$fp = fopen("localfile.html", "w");
curl_setopt($ch, CURLOPT_FILE, $fp);   // write the response directly to the file handle
curl_setopt($ch, CURLOPT_HEADER, 0);   // exclude the HTTP headers from the output
curl_exec($ch);
curl_close($ch);
fclose($fp);
?>
Now you can read localfile.html.
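If you then want to pull out the first image (matching the 'section article img' selector from the question), a minimal sketch using PHP's built-in DOMDocument and DOMXPath could look like this; the file name and selector are assumptions carried over from the code above:
<?php
// Assumes localfile.html was created by the cURL snippet above.
$doc = new DOMDocument();
libxml_use_internal_errors(true);          // suppress warnings from imperfect HTML
$doc->loadHTMLFile('localfile.html');
libxml_clear_errors();

$xpath = new DOMXPath($doc);
// Rough equivalent of the 'section article img' selector from the question.
$nodes = $xpath->query('//section//article//img');
if ($nodes->length > 0) {
    echo $nodes->item(0)->getAttribute('src');  // URL of the first matching image
}
?>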
Upvotes: 2