Pjottur

Reputation: 622

Release memory after mysql query

I am using CodeIgniter to import large tables from remote servers, splitting each query into chunks of 1000 rows at a time.

This will be a cron script that runs in the background.

$userDB is the local DB object, while $remoteDB is the remote DB object.

private function fetch_remote_db_table($remoteDB, $remoteTable, $localTable = FALSE){
    $success = TRUE;

    //Use the remote table name if a local name is not set.
    $localTable = ($localTable === FALSE) ? $remoteTable : $localTable;

    //Raise the execution time and memory limits, since this can be a heavy job for the script.
    ini_set('max_execution_time', 0);
    ini_set('memory_limit', '16000M');

    //Start by truncating the local table, which will be overwritten.
    $this->userDB->truncate($localTable);

    //Fetch the remote table in chunks of 1000 rows. This takes some time.
    $continue = TRUE;
    $counter = 0;
    while($continue){
        $limit = 1000;
        $offset = $counter * $limit;

        //Don't pass the offset when it's 0; CI would add it and break the query.
        if($offset == 0){
            $remoteDB->limit($limit);
        } else {
            $remoteDB->limit($limit, $offset);
        }

        $query = $remoteDB->get($remoteTable);
        $result = $query->result_array();

        if($query->num_rows() > 0){
            //Insert the remote data into the local table.
            if(!$this->userDB->insert_batch($localTable, $result)){
                $success = FALSE;
            }
        } else {
            $continue = FALSE;
        }
        $counter++;
    }

    //Debugging: show the profiler output and every variable still in scope.
    $this->output->enable_profiler(TRUE);
    var_dump(get_defined_vars());

    return $success;
}

My problem is that with every iteration, the results stay in memory. How do I go about clearing the result set from memory after each iteration? Ideally, I'd like to keep the query metadata and only discard the row data.

Upvotes: 4

Views: 2284

Answers (1)

Pjottur

Reputation: 622

OK, lesson learned. To whoever might experience the same issue:

CodeIgniter can also save the insert queries it runs, and those saved query strings take up memory.

In my setup I had $remoteDB, the remote connection used to retrieve the data, and $userDB, the connection to my local DB into which I inserted the data. The latter had

 $userDB->save_queries = TRUE;

so it kept eating memory without me seeing any variables being set.

After setting this to FALSE before the loop, I could handle as much data as I wanted without draining memory, and I still get to keep the query metadata.
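For illustration, a minimal sketch of the fix (assuming a CodeIgniter setup like the one in the question, with $remoteDB and $userDB as the two DB objects): disable query logging on both connections before the loop, and additionally call CI's free_result() on each result object so its row data is released per chunk.

```php
// $remoteDB and $userDB are the CodeIgniter database objects from the
// question; $remoteTable and $localTable are the table names.

// Stop CI from keeping a copy of every executed query string.
$remoteDB->save_queries = FALSE;
$this->userDB->save_queries = FALSE;

// Inside the chunk loop, after inserting each batch:
$query = $remoteDB->get($remoteTable);
$result = $query->result_array();
$this->userDB->insert_batch($localTable, $result);

// Release the row data held by the result object; metadata such as
// num_rows() has already been read, and the DB object is unaffected.
$query->free_result();
unset($result);
```

With save_queries disabled, the profiler will no longer list the executed queries, so re-enable it temporarily if you need that output while debugging.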

Upvotes: 2
