I have a MySQL database, and it is very large. When I select about 1M records, I need to write them out as a CSV file on disk. I wrote the PHP script below, but it gets killed by Linux. How can I make the PHP script fast enough and keep it from being killed?
$batches = (int) ceil($itemcount / 50000);   // number of 50k-row chunks
for ($i = 0; $i < $batches; $i++) {
    $offset = $i * 50000;
    $sql = $sql_org . " LIMIT $offset, 50000";
    $stmt = $db->prepare($sql);
    if ($stmt) {
        if ($stmt->execute()) {
            $stmt->bind_result($FIRSTNAME, $LASTNAME, $PHONE, /* ... remaining columns */);
            // ... fetch rows, write them to the CSV, then $stmt->close() ...
        }
    }
}
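For reference, a minimal sketch of a lower-memory variant of this loop, assuming the same $sql_org and a mysqli connection (the connection parameters and output path below are placeholders): run the query unbuffered with MYSQLI_USE_RESULT so rows are fetched from the server one at a time, and write each row straight to disk with fputcsv() instead of accumulating a batch in memory.

$db  = new mysqli('localhost', 'user', 'pass', 'dbname');
$out = fopen('/path/to/export.csv', 'w');

// MYSQLI_USE_RESULT streams rows instead of buffering the whole result set in PHP.
$result = $db->query($sql_org, MYSQLI_USE_RESULT);
while ($row = $result->fetch_row()) {
    fputcsv($out, $row);   // quotes/escapes the fields and appends one CSV line
}
$result->free();
fclose($out);

With an unbuffered result the LIMIT/OFFSET batching is unnecessary, and deep offsets on a 1M-row table are slow anyway, since MySQL still has to scan past the skipped rows.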
Install the mysql command-line client and pipe its output straight to a CSV file:
shell_exec('mysql -u username --password="password" --database=dbname --host=AWShostname --port=AWSport --batch \
    -e "select * from \`table\`" \
    | sed \'s/\t/","/g;s/^/"/;s/$/"/;s/\n//g\' > /path/to/yourlocalfilename.csv');
Note: exporting a huge amount of data out of AWS can be expensive (outbound data transfer is billed).
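If the credentials and paths come from variables rather than literals, a sketch of the same call with escapeshellarg() guarding every interpolated value (all variable names below are placeholders):

$user = 'username';
$pass = 'password';
$host = 'AWShostname';
$port = 'AWSport';
$name = 'dbname';
$file = '/path/to/yourlocalfilename.csv';

// Build the pipeline with each value shell-quoted, then run it.
$cmd = sprintf(
    'mysql -u %s --password=%s --database=%s --host=%s --port=%s --batch -e %s | sed %s > %s',
    escapeshellarg($user),
    escapeshellarg($pass),
    escapeshellarg($name),
    escapeshellarg($host),
    escapeshellarg($port),
    escapeshellarg('select * from `table`'),
    escapeshellarg('s/\t/","/g;s/^/"/;s/$/"/'),
    escapeshellarg($file)
);
shell_exec($cmd);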