Dawg

Reputation: 69

Querying a 100k record table, execution time exceeded

Every member has multiple records in the database. I need to sum one column per user and get the highest and lowest values of another column. The table has over 100k records, and a single user might have more than 2k of them.

I tried this:

$query = $mysqli->query("SELECT DISTINCT `id` FROM `table`");

if ($query) {
  $IDs = $query->fetch_all();
  unset($query);

  set_time_limit(300);
  foreach ($IDs as $value) {
    // one query per user ID
    $query = $mysqli->query("SELECT SUM(`price`), `dtime` FROM `table` WHERE `id` = '".$value[0]."' ORDER BY `dtime` DESC");
    if ($query) {
      $row = $query->fetch_assoc();
      print_r($row);
    }
  }
}

But is raising the time limit to 300 seconds really the proper way of doing this? I also tried a prepared statement, only assigning the ID inside the loop, and several other things, none of which worked as I'd wish.
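
For reference, the prepared-statement variant looked roughly like this (assuming mysqlnd is available, so get_result() works):

$stmt = $mysqli->prepare("SELECT SUM(`price`), `dtime` FROM `table` WHERE `id` = ?");
if ($stmt) {
  foreach ($IDs as $value) {
    $id = $value[0];
    $stmt->bind_param('s', $id);    // only the ID changes per iteration
    $stmt->execute();
    $result = $stmt->get_result();  // get_result() requires mysqlnd
    print_r($result->fetch_assoc());
  }
  $stmt->close();
}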

Upvotes: 1

Views: 1187

Answers (1)

Jeff Breadner

Reputation: 1438

100k records really isn't that many; there should be no reason for this query to take longer than 5 minutes.

Instead of getting a distinct list of IDs and then running one query per ID, it would probably be better to do everything at once in a single query, then iterate over the results to do whatever you need.

select 
  `id`, 
  sum(`price`) as `sum_price`, 
  min(`dtime`) as `min_dtime`, 
  max(`dtime`) as `max_dtime` 
from 
  `table` 
group by 
  `id`

(this assumes that the "other column" you need the min and max of is dtime)

I'm not strong on the PHP side, but from a SQL perspective it's much, much faster to do things this way.
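
That said, a minimal sketch of the PHP side might look like this (assuming the same $mysqli connection and `table` from the question):

$result = $mysqli->query(
  "SELECT `id`,
          SUM(`price`) AS `sum_price`,
          MIN(`dtime`) AS `min_dtime`,
          MAX(`dtime`) AS `max_dtime`
   FROM `table`
   GROUP BY `id`"
);

if ($result) {
  while ($row = $result->fetch_assoc()) {
    // one row per user: id, summed price, earliest and latest dtime
    print_r($row);
  }
  $result->free();
}

This way the database does the grouping in a single pass instead of making thousands of round trips, one per user.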

Upvotes: 1
