Erik van de Ven

Reputation: 4985

Import CSV file in chunks with CakePHP

I'm trying to import a large CSV file, with about 23,000 rows, into my MySQL database. I can't import all rows at once; that just doesn't work. So I was wondering whether it's possible to read the file in chunks, even while I'm using CakePHP transactions. This is my code so far:

// Get data source for transactions
$dataSource = $this->FeedImport->Product->getDataSource();

try{
    //Start Transactions
    $dataSource->begin();

    // Create an empty array for the CSV data
    $data = array();
    $i = 0;

    // read each data row in the file
    while (($row = fgetcsv($handle)) !== false) {
        // for each header field
        foreach ($header as $k=>$head) {
            // Remove any special characters from $head
            $head = preg_replace('/[^A-Za-z0-9\-]/', '', $head);
            if(array_key_exists($head, $this->fields)){
                //Check the row contains an image, if so, download
                if(preg_match('/\.(?:jpe?g|png|gif)$/i', $row[$k])){
                    foreach($this->fields[$head] as $table => $field){
                        $imageFileName = uniqid($supplier.'_');
                        // pathinfo() instead of end(explode()), which passes a non-variable by reference
                        $ext = pathinfo($row[$k], PATHINFO_EXTENSION);
                        $data[$i][$table][][$field] = $imageFileName.'.'.$ext;
                        $this->__importImg($row[$k]);
                    }
                }else{
                    foreach($this->fields[$head] as $table => $field){
                        if($table == 'Term'){
                            // Create the term if it does not exist yet
                            if(isset($row[$k]) && !$this->FeedImport->Product->Term->find('first', array('conditions' => array('Term.name' => $row[$k])))){
                                if(!$this->FeedImport->Product->Term->save(array('name' => $row[$k]))){
                                    throw new Exception('Could not save term');
                                }
                            }
                            if(isset($row[$k])) $term = $this->FeedImport->Product->Term->find('first', array('conditions' => array('Term.name' => $row[$k])));
                            $data[$i][$table][$table][$field] = (isset($term['Term']['term_id'])) ? $term['Term']['term_id'] : '';
                        }else{
                            $data[$i][$table][$field] = (isset($row[$k])) ? $row[$k] : '';
                        }
                    }
                }
            }
        }

        $data[$i]['Product']['product_id_supplier'] = $data[$i]['Product']['slug'];
        $data[$i]['Product']['supplier_id'] = $supplier;
        $data[$i]['Product']['feedimport_id'] = 1;

        $i++;
    }

    // save all rows in one transaction
    if (!$this->FeedImport->Product->saveAll($data)) {
        throw new Exception();
    }

    // only commit when saving succeeded
    $dataSource->commit();

} catch(Exception $e) {
    $dataSource->rollback();
}

I've put the code above in a separate function, so I can pass a start line and an end line for the while loop. But that's where I got stuck: I don't know how to set a start and end row using fgetcsv.

I've tried using fseek and such, but I just can't get it done... Can someone help me out here?

I considered using LOAD DATA INFILE to import those big product feeds, but I don't think that's going to work nicely, because I'm using multiple joined tables and some exceptions when importing data into several tables... so that's too bad.

Upvotes: 0

Views: 2620

Answers (2)

riekelt

Reputation: 510

A possible workaround would be the following

while (($row = fgetcsv($handle)) !== false) {
    if ($i === 5000){
        try {
            if (!$this->FeedImport->Product->saveAll($data)) {
                throw new Exception();
            }
            // Commit this chunk, then open a new transaction for the next one
            $dataSource->commit();
            $dataSource->begin();
        } catch (Exception $e){
            $dataSource->rollback();
        }

        $i = 0;
        $data = array();
    }

    // Code that fills $data[$i] and increments $i
}
// Remember to saveAll() and commit the remaining rows (< 5000) after the loop

Every 5000 records it will save and commit that chunk, reset the counter and the data array, and continue. Note that the rows left over after the loop (fewer than 5000) still need a final save and commit.
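If you want an explicit start and end row instead of a rolling counter, you can count rows yourself while reading with fgetcsv. A minimal sketch (readCsvSlice is a made-up helper name, not a CakePHP API):

```php
<?php
// Return only rows $start..$end (0-based, inclusive) from a CSV file.
function readCsvSlice($path, $start, $end) {
    $handle = fopen($path, 'r');
    $rows = array();
    $line = 0;
    while (($row = fgetcsv($handle)) !== false) {
        if ($line > $end) {
            break; // past the requested range, stop reading
        }
        if ($line >= $start) {
            $rows[] = $row; // inside the requested range
        }
        $line++;
    }
    fclose($handle);
    return $rows;
}
```

The rows before $start are still read and discarded, but they are never kept in memory, so each call only holds one chunk.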

Upvotes: 1

mark

Reputation: 21743

Using PHP 5.5 you can leverage the new generator feature to read the file lazily, chunk by chunk. See http://mark-story.com/posts/view/php-generators-a-useful-example
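A minimal sketch of that idea, assuming PHP >= 5.5 (csvRows and batch are made-up helper names, not CakePHP or SPL APIs):

```php
<?php
// Yield CSV rows one at a time, so the whole file never sits in memory.
function csvRows($path) {
    $handle = fopen($path, 'r');
    while (($row = fgetcsv($handle)) !== false) {
        yield $row;
    }
    fclose($handle);
}

// Group any iterable into chunks of $size items.
function batch($rows, $size) {
    $chunk = array();
    foreach ($rows as $row) {
        $chunk[] = $row;
        if (count($chunk) === $size) {
            yield $chunk;
            $chunk = array();
        }
    }
    if ($chunk) {
        yield $chunk; // the final partial chunk
    }
}
```

Usage would then be a single loop, saving and committing one chunk per iteration:

    foreach (batch(csvRows('products.csv'), 5000) as $chunk) {
        // build $data from $chunk, then saveAll() and commit
    }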

Upvotes: 1
