Aaron Nguyen

Reputation: 195

How to speed up a PHP backup script?

I have been using a MySQL DB backup script in PHP, and the backups of my databases take 21 hours. I want to use it as a daily backup script through the CLI and cron, and I was wondering if you guys could take a look at it and see if there is any way I can optimize it.

<?php

//Timer start
$time = microtime();
$time = explode(' ', $time);
$time = $time[1] + $time[0];
$start = $time;


ini_set('memory_limit','4000M');
//ini_set('max_execution_time', 300);


$host = "host";
$user = "user";
$pass = "pass";

$db = "tagdb";

$link = mysql_connect($host,$user,$pass);
$result = mysql_query("show databases like 'tag%'"); // pick up the tag* databases

$dbs = array();
while($row = mysql_fetch_row($result))
{
    $dbs[] = $row[0];
}

foreach($dbs as $db)
{
    if(strlen($db) == 10 || $db == "tagdb" || $db == "tagui")
    {
        echo $db."\n";
        backup_tables($host,$user,$pass,$db);
    }
}


//backup_tables($host,$username,$password,$db);

/* backup the whole db OR just selected tables */
function backup_tables($host,$user,$pass,$name,$tables = '*')
{
    $fname = 'db-backup-'.$name.'_'.time().'.sql';
    echo $fname."\n";

    // create (or truncate) the dump file
    $handle = fopen($fname,'w+');
    fclose($handle);

    $link = mysql_connect($host,$user,$pass);
    mysql_select_db($name,$link);

    // get all of the tables
    if($tables == '*')
    {
        $tables = array();
        $result = mysql_query('SHOW TABLES');
        while($row = mysql_fetch_row($result))
        {
            $tables[] = $row[0];
        }
    }
    else
    {
        $tables = is_array($tables) ? $tables : explode(',',$tables);
    }

    // append to the dump file, one table at a time
    $handle = fopen($fname,'a');

    foreach($tables as $table)
    {
        fwrite( $handle, 'DROP TABLE IF EXISTS '.$table.';' );

        $row2 = mysql_fetch_row(mysql_query('SHOW CREATE TABLE '.$table));
        fwrite( $handle, "\n\n".$row2[1].";\n\n" );

        // page through the table so only $offset rows are held in memory at once
        $offset = 10000;
        $start = 0;
        do {
            $result = mysql_query( "SELECT * FROM ".$table." LIMIT ".$start.", ".$offset );
            $start += $offset;

            if (false === $result) {
                // log the error to a side file and stop dumping this table
                $errf = fopen($fname.'.ERROR','a');
                fwrite( $errf, mysql_error() );
                fclose($errf);
                break;
            }

            $num_rows = mysql_num_rows( $result );

            while( $row = mysql_fetch_row( $result ) ) {
                $line = 'INSERT INTO '.$table.' VALUES(';
                foreach( $row as $value ) {
                    $value = addslashes( $value );
                    $value = str_replace( "\n", "\\n", $value ); // ereg_replace() is deprecated
                    $line .= '"'.$value.'",';
                }
                $line = substr( $line, 0, -1 ); // cut the final ','
                $line .= ");\n";
                fwrite( $handle, $line );
            }
        } while( $num_rows !== 0 );
    }

    fwrite($handle,"\n\n\n");
    fclose($handle);
}

//End timer and output time
$time = microtime();
$time = explode(' ', $time);
$time = $time[1] + $time[0];
$finish = $time;
$total_time = round(($finish - $start), 4);
echo "\n Page generated in ".$total_time." seconds. \n";
?>

I was thinking about splitting the backups into parallel processes, but my databases are huge and I'm not sure whether PHP will run out of memory, because it has before. Something along the lines of the sketch below is what I had in mind. Suggestions are welcome.
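This is only a rough sketch of the parallel idea; backup_single_db.php is a hypothetical worker script that would just call backup_tables() for the single database name passed on the command line:

<?php
// launcher sketch: one PHP worker process per database, so each worker
// gets its own memory limit and the dumps run in parallel
$dbs = array("tagdb", "tagui"); // placeholder list; in practice built with the same SHOW DATABASES filter as above

foreach ($dbs as $db) {
    // redirect output and background the command so exec() returns immediately
    $cmd = 'php backup_single_db.php '.escapeshellarg($db).' > /dev/null 2>&1 &';
    exec($cmd);
}
?>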

Thanks!

PS: I know there are better ways to do a backup, but this particular script works well for me, as the other options, such as mysqldump, are not available to me.

Upvotes: 0

Views: 629

Answers (3)

LSerni

Reputation: 57398

I was utilizing mysqldump but realized that it was driving productivity down because of locks on the MyISAM DBs, and the backups were unusable if I just removed the locks and continued backing up the DBs while they were in use. That's why I need something like a hot-copy backup for the MyISAM DBs.

Correct me if I'm wrong, but your script does no locking at all. If it takes 21 hours to execute, then you've got to have synchronization issues.

If you can't use mysqldump, the only practical solutions I see are to use a MySQL slave (see http://www.redips.net/mysql/add-new-slave/ - points 1 and 2 are what interest you) or mysqlhotcopy: http://dev.mysql.com/doc/refman/5.1/en/mysqlhotcopy.html.

The first solution is more complicated to set up, but it's probably cleaner and simpler (also, I think, faster) to run. And with a little more set-up, it can double as load-balancing on the SQL server.
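If you go the mysqlhotcopy route, you can still drive it from the same PHP cron job. Roughly like this (the backup directory and credentials are placeholders, and mysqlhotcopy has to run on the database host itself):

<?php
// rough sketch only - credentials and the backup directory are placeholders
$db   = 'tagdb';
$dest = '/backups/'.date('Y-m-d');
if (!is_dir($dest)) {
    mkdir($dest, 0750, true);
}

// mysqlhotcopy copies the MyISAM table files directly, holding a read lock
// only for the duration of the copy
$cmd = 'mysqlhotcopy --user=user --password=pass '
     . escapeshellarg($db).' '.escapeshellarg($dest).' 2>&1';
exec($cmd, $output, $status);

if ($status !== 0) {
    echo "mysqlhotcopy failed:\n".implode("\n", $output)."\n";
}
?>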

Upvotes: 0

Matt

Reputation: 7040

If you're looking to back up your database, just use mysqldump as Dagon mentioned. Call it from a cron task on a schedule. Then, if you need to roll back, you can simply find the proper dump (stored in a .sql file) and execute the file.

mysqldump -u [username] -p[password] db_name > [timestamped_filename].sql

Read this article for more information.

Upvotes: 2

Parallelis

Reputation: 719

Yes, you can copy a table directly in MySQL:

CREATE TABLE new_table LIKE old_table;
INSERT new_table SELECT * FROM old_table;

It will be way faster, and you can also have your tables in two separate databases (NEW_DB and OLD_DB):

CREATE TABLE NEW_DB.new_table LIKE OLD_DB.old_table;
INSERT NEW_DB.new_table SELECT * FROM OLD_DB.old_table;

So you can do it from PHP, without transferring the data through PHP, with a huge gain in performance.
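A rough sketch of what that might look like, reusing the mysql_* calls from your script (the database and table names are placeholders):

<?php
// rough sketch - the copy happens entirely inside MySQL; no rows are fetched into PHP
$link = mysql_connect('host', 'user', 'pass');

$src = 'OLD_DB.old_table';  // placeholder names
$dst = 'NEW_DB.new_table';

mysql_query('CREATE TABLE '.$dst.' LIKE '.$src, $link) or die(mysql_error());
mysql_query('INSERT '.$dst.' SELECT * FROM '.$src, $link) or die(mysql_error());
?>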

Upvotes: 6
