Roshan Wijesena

Reputation: 3136

Break down a large number of rows in a CSV and process them batch-wise in PHP

I am about to edit and upload a CSV with more than 50,000 records (shopping cart products), and it should update a number of tables in the system. I am using Zend Framework with my shopping cart.

I am planning to break them down (the 50,000 CSV records) in memory before processing them batch-wise using PHP/MySQL.

Can anyone give me advice on this?

What I have so far is:

public function getDataFromPath($path = null) {
    if ($path === null) {
        $path = $this->_path;
    }

    // Detect Mac/Windows line endings in the uploaded file.
    ini_set("auto_detect_line_endings", 1);

    $fp = fopen($path, "r");
    if ($fp === false) {
        return $this->data;
    }

    // Read every CSV row into memory, trimming each field.
    while (($line = fgetcsv($fp, 5000, ",")) !== false) {
        $this->data[] = self::trimArray($line);
    }
    fclose($fp);

    return $this->data;
}
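For the batch-wise part, this is the kind of thing I have in mind (just a rough sketch: the products table and its columns sku, name, price are placeholders for my real schema, and $pdo is an existing PDO connection):

function importCsvInBatches(PDO $pdo, $path, $batchSize = 500) {
    // Sketch only: `products` (sku, name, price) stands in for the real schema
    // and assumes each CSV row has exactly those three columns.
    ini_set("auto_detect_line_endings", 1);

    $fp = fopen($path, "r");
    if ($fp === false) {
        throw new RuntimeException("Cannot open CSV file: $path");
    }

    $stmt = $pdo->prepare(
        "INSERT INTO products (sku, name, price) VALUES (?, ?, ?)"
    );

    $count = 0;
    $pdo->beginTransaction();
    while (($line = fgetcsv($fp, 5000, ",")) !== false) {
        $stmt->execute(array_map('trim', $line));
        // Commit every $batchSize rows to keep any one transaction small.
        if (++$count % $batchSize === 0) {
            $pdo->commit();
            $pdo->beginTransaction();
        }
    }
    $pdo->commit();
    fclose($fp);

    return $count;
}

Committing every few hundred rows should avoid both one huge transaction and the overhead of one commit per row.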

Regards, Roshan

Upvotes: 0

Views: 1303

Answers (2)

fancyPants

Reputation: 51888

No RDBMS out there should have problems with 50,000 rows. That's nothing. There's no need to process them batch-wise.

Just use the LOAD DATA INFILE command and you will be fine.

For an example see here: LOAD DATA INFILE easily convert YYYYMMDD to YYYY-MM-DD?
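If you want to run it from PHP, something like this sketch works (the database name, table, and file path here are placeholders, and LOCAL INFILE has to be enabled on both the client and the server):

// Sketch only: connection details, table name and CSV path are placeholders.
$pdo = new PDO(
    'mysql:host=localhost;dbname=shop',
    'user',
    'password',
    array(PDO::MYSQL_ATTR_LOCAL_INFILE => true) // allow LOAD DATA LOCAL
);

// Load the whole CSV in one statement; MySQL does the parsing.
// "\\n" keeps the \n escape literal so MySQL interprets it, not PHP.
$pdo->exec("
    LOAD DATA LOCAL INFILE '/tmp/products.csv'
    INTO TABLE products
    FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '\"'
    LINES TERMINATED BY '\\n'
    IGNORE 1 LINES
");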

UPDATE (because of a comment on Ian Wood's answer): To create a CSV file, you can use the SELECT ... INTO OUTFILE command.

SELECT a,b,a+b INTO OUTFILE '/tmp/result.txt'
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
FROM test_table;

For more info, see the manual.

Upvotes: 1

Ian Wood

Reputation: 6573

This is a job for...

DATABASE MAN!!!

Load your CSV directly into a table using the LOAD DATA INFILE business and do whatever magic you need after that.

Upvotes: 1
