kylex

Reputation: 14406

When querying large data sets, prevent script timeout

I have the following code:

$query = mysql_query("SELECT * FROM mytable");

while($row = mysql_fetch_assoc($query)){
    mysql_query("INSERT INTO mytable2 (col1, col2) 
                 VALUES ('".$row['val1']."', '".$row['val2']."')");
}

Understandably, the script times out at about 150,000 queries. Aside from raising the script's memory and time limits, what's the best way to prevent timeouts?

Upvotes: 3

Views: 1472

Answers (2)

RolandoMySQLDBA

Reputation: 44343

Why not run it as a single query?

$SQL = "INSERT INTO mytable2 (col1,col2) SELECT val1,val2 FROM mytable";
$query = mysql_query($SQL); 
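Running the copy as one statement keeps all the work inside the MySQL server, so PHP never has to pull 150,000 rows across the connection or issue 150,000 separate INSERTs.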

ALTERNATIVE

You could also throttle your INSERTs, 200 at a time:

$query = mysql_query("SELECT * FROM mytable"); 
$commit_count = 0;    
$commit_limit = 200;
$comma = "";
$SQL = "INSERT INTO mytable2 (col1, col2) VALUES ";
while($row = mysql_fetch_assoc($query)){ 
    $SQL .= $comma . "('".$row['val1']."','".$row['val2']."')";
    $comma = ",";
    $commit_count++;
    if ( $commit_count == $commit_limit )
    {
        mysql_query($SQL);
        $SQL = "INSERT INTO mytable2 (col1, col2) VALUES ";
        $commit_count = 0;    
        $comma = "";
    }
} 
if ( $commit_count > 0 ) { mysql_query($SQL); }

You can change $commit_limit to any reasonable positive number.
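Since the mysql_* functions above are deprecated, here is a minimal sketch of the same throttling idea using mysqli prepared statements with periodic commits; the connection parameters are placeholders, not part of the original answer:

$db = new mysqli("localhost", "user", "password", "mydb"); // placeholder credentials
$select = $db->query("SELECT val1, val2 FROM mytable");
$insert = $db->prepare("INSERT INTO mytable2 (col1, col2) VALUES (?, ?)");
$db->autocommit(false); // group rows into transactions instead of committing every INSERT
$count = 0;
while ($row = $select->fetch_assoc()) {
    // The prepared statement handles quoting, so no manual escaping is needed
    $insert->bind_param("ss", $row['val1'], $row['val2']);
    $insert->execute();
    // Commit every 200 rows, mirroring $commit_limit above
    if (++$count % 200 == 0) {
        $db->commit();
    }
}
$db->commit(); // flush the final partial batch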

Upvotes: 6

Mosty Mostacho

Reputation: 43434

You should consider using an INSERT ... SELECT statement instead of running lots of single INSERTs.
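Applied to the tables in the question, a minimal sketch:

mysql_query("INSERT INTO mytable2 (col1, col2) SELECT val1, val2 FROM mytable");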

Upvotes: 2
