Reputation: 7954
I have a table with around 100,000 records; the structure is shown below:
id | name    | desc   | length | breadth | ... | remark      // up to 56 fields
---|---------|--------|--------|---------|-----|-----------
1  | FRT-100 | -desc- | 10     | 10      | ... | remarking
What I am doing is using a cron job (Gearman) to write all this data to a CSV; my code is given below:
<?php
set_time_limit(0);
ini_set("memory_limit", "2048M");

// Get the total count of records, so that we can loop over them in small chunks
$query = "SELECT COUNT(*) AS cnt FROM datas WHERE company_id = $companyid";
$result = $link->query($query);
$count = 0;
while ($row = $result->fetch_assoc()) {
    $count = $row["cnt"];
}

if ($count > 1000) {
    $loop = ceil($count / 1000);
} else {
    $loop = 1;
}

// Write the data in chunks of 1000 rows each to avoid a timeout
for ($ii = 1; $ii <= $loop; $ii++) {
    $s = ($ii - 1) * 1000;
    $q = "SELECT * FROM datas WHERE company_id = $companyid LIMIT 1000 OFFSET $s";
    $r = $link->query($q);
    while ($row2 = $r->fetch_assoc()) {
        // The CSV writing is done here; it works fine for up to 10,000 ~ 12,000 records,
        // after which memory exhaustion occurs
    }
}
?>
I strongly suspect that something can be optimised in MySQL's OFFSET handling. Can someone show me a better way to optimise this? I'm open to any suggestions (cron, third-party libraries, etc.).
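One commonly suggested alternative to a growing OFFSET is keyset pagination, i.e. seeking on the indexed id column from the structure above instead of asking MySQL to skip rows. A minimal sketch only, assuming id is an auto-increment primary key and reusing the $link and $companyid variables from the code above:
<?php
$lastId = 0;
$batch  = 1000;
do {
    // Seek past the last id we saw instead of skipping $s rows
    $q = "SELECT * FROM datas
          WHERE company_id = $companyid AND id > $lastId
          ORDER BY id
          LIMIT $batch";
    $r = $link->query($q);
    $rows = 0;
    while ($row = $r->fetch_assoc()) {
        $lastId = $row["id"];   // remember where this chunk ended
        // CSV writing for $row would go here
        $rows++;
    }
    $r->free();
} while ($rows === $batch);     // a short chunk means we are done
?>
Each chunk starts from an index lookup on id, so the per-chunk cost stays flat no matter how far into the table the export has progressed.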
Upvotes: 0
Views: 110
Reputation: 869
Try to avoid storing everything in memory at once; instead, load each row and write it out one row at a time.
<?php
// Unbuffered query: rows are streamed from MySQL one at a time instead of
// the whole result set being buffered in PHP memory first.
$q = "SELECT * FROM datas";
$r = $link->query($q, MYSQLI_USE_RESULT);
$fp = fopen("out.csv", "w");
// Or you could just set the headers for content type and echo the output
while ($row2 = $r->fetch_assoc()) {
    fputcsv($fp, $row2);   // fputcsv handles quoting, unlike a plain implode()
}
fclose($fp);
This should solve the issue: with an unbuffered query, only the current row is held in memory at any time.
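If the CSV should be sent straight to the browser instead of written to disk (the header approach mentioned in the comment above), a minimal sketch could look like this; the filename and query are illustrative and assume the same $link connection:
<?php
header('Content-Type: text/csv; charset=utf-8');
header('Content-Disposition: attachment; filename="export.csv"');

$out = fopen('php://output', 'w');   // write directly to the response body

// Unbuffered query again, so rows are streamed rather than buffered
$r = $link->query("SELECT * FROM datas", MYSQLI_USE_RESULT);
while ($row = $r->fetch_assoc()) {
    fputcsv($out, $row);
}
$r->free();
fclose($out);
?>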
Upvotes: 2