Ian

Reputation: 865

Pass multiple URLs into a PHP script

I've got this script that will generate a thumbnail for any URL passed to it (e.g. my_script.php?url=http://www.google.com).

It works, but I want to modify it so that I can pass a large number of URLs (2,100 of them) through it and generate a screenshot for each one. It already saves the images in separate folders as well.

Here is the pertinent code:

// Only run for authorized requests
if ($_GET['auth'] == 'todc_admin') {
    if (isset($_GET['id'])) {
        $wb = new WebThumb();
        $wb->setApi("API-KEY-HERE");
        $job = $wb->requestThumbnail($_GET['url']);
        $job_id = $job[0]['id'];
        // Poll the API until the thumbnail job completes
        while (true) {
            $job_status = $wb->requestStatus($job_id);
            $status = $job_status['status'];
            if ($status == "Complete") {
                break;
            } else {
                sleep(5);
                continue;
            }
        }
        // Image generated, so let's save it
        //header("Content-type: image/jpeg");
        $wb = new WebThumb();
        $wb->setApi("API-KEY-HERE");
        $filename = ABSPATH . todc_design_dir($_GET['id']) .'/thumbnail.jpg';
        $imgurl = todc_design_dir($_GET['id']) .'/thumbnail.jpg';
        $content = $wb->getThumbnail($job_id, "large");
        if (file_put_contents($filename, $content)) {
            echo '<img src="http://www.myurl.com'. $imgurl .'" />';
        }
    }
}

I'm also able to generate a list of all the URLs I need thumbnails for, using this:

$designs = get_posts( array('post_type' => 'design', 'post_status' => 'publish', 'orderby' => 'date', 'showposts' => -1) );

foreach ($designs as $design) {
    $previewlink = get_bloginfo('wpurl') . todc_design_dir($design->ID);
    // ... use $previewlink here
}

Then I echo $previewlink wherever I need it.

I'm just struggling to put the two together.

Any thoughts?

Upvotes: 0

Views: 113

Answers (2)

FThompson

Reputation: 28687

You could pass the URLs as a JSON-encoded array, which you then json_decode back into an array in the script, and iterate over each URL with a foreach loop.

Also, you should use POST for that much data, since GET requests are subject to URL length limits.

$urls = json_decode($_POST['url']);
foreach ($urls as $url) {
    $job = $wb->requestThumbnail($url);
    // rest of code
}
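
On the sending side, one way to tie the two pieces together is to build the array from the get_posts() loop in the question and POST it with cURL. A rough sketch (the endpoint URL and path are guesses based on the question; adjust to match your setup):

// Collect the preview URLs from the WordPress loop in the question
$urls = array();
foreach ($designs as $design) {
    $urls[] = get_bloginfo('wpurl') . todc_design_dir($design->ID);
}

// POST the JSON-encoded list to the thumbnail script
$ch = curl_init('http://www.myurl.com/my_script.php?auth=todc_admin');
curl_setopt($ch, CURLOPT_POST, true);
curl_setopt($ch, CURLOPT_POSTFIELDS, array('url' => json_encode($urls)));
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($ch);
curl_close($ch);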

You may also need to increase the script's maximum execution time, depending on how long that many URLs would take to process; use set_time_limit(int $seconds) for this.
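
For example, at the top of the script:

// Remove PHP's default execution time limit (0 means no limit)
set_time_limit(0);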

Upvotes: 2

Martin Sommervold

Reputation: 152

My first thought is that this sounds process-intensive. Doing it through your web browser is prone to letting PHP max out on memory and time limits. A better option could be to store the URLs in a database and run the batch as a forked process.
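
A minimal sketch of that idea, assuming a hypothetical thumbnail_queue table (the table name, columns, output path, and connection details are placeholders) and reusing the WebThumb calls from the question:

<?php
// generate_thumbnails.php -- run from the command line, not the browser
$db = new PDO('mysql:host=localhost;dbname=mydb', 'user', 'pass');
$rows = $db->query("SELECT id, url FROM thumbnail_queue WHERE done = 0");

$wb = new WebThumb();
$wb->setApi("API-KEY-HERE");

foreach ($rows as $row) {
    $job = $wb->requestThumbnail($row['url']);
    $job_id = $job[0]['id'];

    // Poll until the job completes, as in the original script
    do {
        sleep(5);
        $job_status = $wb->requestStatus($job_id);
    } while ($job_status['status'] != "Complete");

    // Save the image and mark this URL as done
    file_put_contents("thumbnails/" . $row['id'] . ".jpg", $wb->getThumbnail($job_id, "large"));
    $db->prepare("UPDATE thumbnail_queue SET done = 1 WHERE id = ?")
       ->execute(array($row['id']));
}

Launched with something like php generate_thumbnails.php > /dev/null 2>&1 &, it runs detached from any web request, and PHP applies no execution time limit on the command line by default.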

Upvotes: 0
