Jürgen Paul

Reputation: 15027

getimagesize() limiting file size for remote URL

I could use getimagesize() to validate an image, but the problem is: what if a mischievous user puts a link to a 10GB random file? That would whack my production server's bandwidth. How do I limit the file size getimagesize() downloads (e.g. 5MB max image size)?

PS: I did research before asking.

Upvotes: 1

Views: 1946

Answers (3)

Ja͢ck

Reputation: 173642

You can download the file separately, imposing whatever maximum size you wish to download:

function mygetimagesize($url, $max_size = -1)
{
    // create a temporary file to store data from $url
    if (false === ($tmpfname = tempnam(sys_get_temp_dir(), uniqid('mgis')))) {
        return false;
    }

    // open input and output; clean up if either fails
    if (false === ($in = fopen($url, 'rb')) || false === ($out = fopen($tmpfname, 'wb'))) {
        if (is_resource($in)) {
            fclose($in);
        }
        unlink($tmpfname);
        return false;
    }

    // copy at most $max_size bytes (-1 means no limit)
    stream_copy_to_stream($in, $out, $max_size);

    // close input and output files
    fclose($in);
    fclose($out);

    // retrieve image information; dimensions sit in the file header,
    // so even a truncated copy is usually enough for getimagesize()
    $info = getimagesize($tmpfname);

    // get rid of the temporary file
    unlink($tmpfname);

    return $info;
}
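
Usage could then look like this (the URL is illustrative), capping the download at 5MB:

// returns the usual getimagesize() array, or false on failure
$info = mygetimagesize('http://example.com/photo.jpg', 5 * 1024 * 1024);
if ($info !== false) {
    list($width, $height) = $info;
}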

Upvotes: 2

xdazz

Reputation: 160893

Here is an example; you'll need to make some changes to fit your requirements.

function getimagesize_limit($url, $limit)
{
    global $phpbb_root_path;

    // temporary file in phpBB's store/ directory
    $tmpfilename = tempnam($phpbb_root_path . 'store/', unique_id() . '-');

    $fp = fopen($url, 'r');
    if (!$fp) {
        return false;
    }

    $tmpfile = fopen($tmpfilename, 'w');

    // read in 8KB chunks until EOF or the limit is reached
    $size = 0;
    while (!feof($fp) && $size < $limit) {
        $content = fread($fp, 8192);
        if ($content === false) {
            break;
        }
        $size += strlen($content); // count the bytes actually read
        fwrite($tmpfile, $content);
    }

    fclose($fp);
    fclose($tmpfile);

    $is = getimagesize($tmpfilename);
    unlink($tmpfilename);
    return $is;
}

Upvotes: 2

deceze
deceze

Reputation: 522442

You don't want to do something like getimagesize('http://example.com') to begin with, since this will download the image once, check the size, then discard the downloaded image data. That's a real waste of bandwidth.

So, separate the download process from the checking of the image size. For example, use fopen to open the image URL, read it little by little and write it to a temporary file, keeping count of how much you have read. Once you cross 5MB and are still not finished reading, stop and reject the image; a minimal sketch follows.
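
A minimal sketch of that loop (the function name and the 5MB default are illustrative, not from the answer):

function fetch_image_or_reject($url, $limit = 5242880) // 5MB
{
    $in = fopen($url, 'rb');
    if ($in === false) {
        return false;
    }

    $tmpname = tempnam(sys_get_temp_dir(), 'img');
    if ($tmpname === false || false === ($out = fopen($tmpname, 'wb'))) {
        fclose($in);
        return false;
    }

    $size = 0;
    while (!feof($in)) {
        $chunk = fread($in, 8192);
        if ($chunk === false) {
            break;
        }
        $size += strlen($chunk);
        if ($size > $limit) {
            // crossed the limit and still not done: stop and reject
            fclose($in);
            fclose($out);
            unlink($tmpname);
            return false;
        }
        fwrite($out, $chunk);
    }

    fclose($in);
    fclose($out);

    return $tmpname; // caller can getimagesize() this file, then unlink it
}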

You could try to read the HTTP Content-Length header before starting the actual download to weed out obviously large files, but you cannot rely on it alone, since it can be spoofed or omitted; see the sketch below.
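
As a sketch, such a pre-check could use get_headers() with a HEAD request (the third argument requires PHP 7.1+; the function name is illustrative):

function content_length_ok($url, $limit)
{
    // send a HEAD request so no body is transferred
    $context = stream_context_create(['http' => ['method' => 'HEAD']]);
    $headers = get_headers($url, 1, $context);
    if ($headers === false || !isset($headers['Content-Length'])) {
        return true; // header missing or request failed: rely on the streaming check
    }
    $length = $headers['Content-Length'];
    if (is_array($length)) {
        $length = end($length); // multiple values after redirects; use the last
    }
    return (int) $length <= $limit;
}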

Upvotes: 2
