Reputation: 5433
I have a piece of code that is designed to receive any URL and rip it down from the web. So far it's been working fine, until someone gave it this URL:
http://www.aspensurgical.com/static/images/aspen_hill-rom_logo.png
If I hit it from my browser, it shows just fine. But when I try to CURL it down, I get:
<!DOCTYPE HTML PUBLIC "-//IETF//DTD HTML 2.0//EN">
<html><head>
<title>403 Forbidden</title>
</head><body>
<h1>Forbidden</h1>
<p>You don't have permission to access /static/images/aspen_hill-rom_logo.png
on this server.</p>
<hr>
<address> Server at www.aspensurgical.com Port 80</address>
</body></html>
The CURL code I'm using is:
$ch = curl_init(str_replace(' ', '%20', $url));
$fh = fopen($local_file, "w");
curl_setopt($ch, CURLOPT_FILE, $fh); // write the response body straight to the file
curl_exec($ch);
curl_close($ch);
fclose($fh);
Is their server somehow realizing I'm not a normal browser and booting me?
Upvotes: 5
Views: 9624
Reputation: 1
I had the same problem; here is the fix I used:
<form action="gc.php" method="post" enctype="multipart/form-data">
    <input type="file" name="f">
    <input type="submit" value="Post">
</form>
<?php
if (isset($_FILES['f']['tmp_name'])) {
    $url = "http://your-url/accept.php";
    $ch = curl_init();
    $cfile = new CURLFile($_FILES['f']['tmp_name'], $_FILES['f']['type'],
        $_FILES['f']['name']);
    $data = array("myfile" => $cfile);
    curl_setopt($ch, CURLOPT_URL, $url);
    curl_setopt($ch, CURLOPT_HEADER, true);
    curl_setopt($ch, CURLOPT_NOBODY, true);
    curl_setopt($ch, CURLOPT_REFERER, $url);
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
    curl_setopt($ch, CURLOPT_MAXREDIRS, 15);
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
    curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 60);
    curl_setopt($ch, CURLOPT_FAILONERROR, true);
    curl_setopt($ch, CURLOPT_POST, true);
    curl_setopt($ch, CURLOPT_POSTFIELDS, $data);
    $response = curl_exec($ch);
    if ($response !== false) {
        echo "File posted";
    } else {
        echo "Error: " . curl_error($ch);
    }
    curl_close($ch);
}
?>
and accept.php
<?php
if (isset($_FILES['myfile']['tmp_name'])) {
    $path = "myfiles/" . $_FILES['myfile']['name'];
    move_uploaded_file($_FILES['myfile']['tmp_name'], $path);
}
?>
Upvotes: 0
Reputation: 21
Some servers, in order to block unwanted traffic, only allow downloads that appear to come from a browser. To get around this, curl has a --user-agent option, which does the trick.
I use curl on my Windows 7 PC, with Gow installed.
Example
curl --user-agent "Mozilla/4.0" http://www.example.com/archives/abc.txt --output pqr.txt
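If you are calling curl from PHP like in the question, the equivalent option is CURLOPT_USERAGENT. Here is a minimal sketch of the question's download code with a browser-like user agent added; the user-agent string and the $local_file path are just example values:

<?php
// Sketch: download a URL to a local file while sending a browser-like
// User-Agent header, since some servers return 403 for unknown agents.
$url = 'http://www.aspensurgical.com/static/images/aspen_hill-rom_logo.png';
$local_file = 'logo.png'; // example destination path

$ch = curl_init(str_replace(' ', '%20', $url));
$fh = fopen($local_file, 'w');
curl_setopt($ch, CURLOPT_FILE, $fh);
// Any plausible browser string should work; this one is just an example.
curl_setopt($ch, CURLOPT_USERAGENT, 'Mozilla/5.0 (Windows NT 6.1) AppleWebKit/537.36');
curl_setopt($ch, CURLOPT_FOLLOWLOCATION, true);
curl_exec($ch);
curl_close($ch);
fclose($fh);
?>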
Upvotes: 2