Lasse A Karlsen

Reputation: 821

Simple MySQL-query takes ages

We are running a stats site for a bunch of games, but after the log has tipped over 100 megs, things are starting to slow down, so we need to optimize our queries.

However, we've found out that what we thought was a simple query, takes about 1.5 secs:

SELECT handle,
(
    SELECT COUNT(*)
    FROM kills
    WHERE killer=p.handle
    AND k_team != c_team
    AND gid=p.gid
) AS kills
FROM players as p
WHERE  gid="3245"
AND team="axis"
ORDER BY kills DESC LIMIT 0, 10;

This produces a result list for one of the teams.

The kills and players tables consist of 36000 and 4000 rows respectively.

Why is that query taking so long, and how can it be optimized? Should we perhaps look into JOIN?

Best regards, Laka

Upvotes: 0

Views: 456

Answers (5)

Tom Schaefer

Reputation: 897

In my experience, the offset in the LIMIT clause (LIMIT 0,10) is the performance killer. If you do not limit, and instead loop through the result set yourself, extracting only the first ten rows, it will speed the query up dramatically. Why? You do not fetch the full result set; you just shift the result pointer to the end. Only the first ten rows are kept, the others are thrown away. It does not matter how big the result set is. Just try it, you'll see!

e.g. in PHP:

$res = mysql_query('SELECT handle,
(
    SELECT COUNT(*)
    FROM kills
    WHERE killer = p.handle
    AND k_team != c_team
    AND gid = p.gid
) AS kills
FROM players AS p
WHERE gid = "3245"
AND team = "axis"
ORDER BY kills DESC');

$i = 0;
$results = array();
while ($row = mysql_fetch_array($res)) {
    if ($i < 10) {
        $results[] = $row;
    }
    $i++;
}

Only on Linux!

On Windows:

$i = 0;
$results = array();
while ($row = mysql_fetch_array($res)) {
    if ($i < 10) {
        $results[] = $row;
    } else {
        mysql_close($db);
        break;
    }
    $i++;
}

Assuming the indexes are set up well.

Upvotes: -1

John Douthat

Reputation: 41179

In general, MySQL performs joins faster than subselects. To learn how to optimize queries, I suggest reading up on the EXPLAIN syntax.
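For instance, a rough sketch of running EXPLAIN on the original query (table and column names taken from the question):

```sql
-- Prefixing a SELECT with EXPLAIN makes MySQL report its execution plan:
-- which indexes it considers and roughly how many rows each step scans.
EXPLAIN SELECT handle,
(
    SELECT COUNT(*)
    FROM kills
    WHERE killer = p.handle
    AND k_team != c_team
    AND gid = p.gid
) AS kills
FROM players AS p
WHERE gid = "3245"
AND team = "axis"
ORDER BY kills DESC LIMIT 0, 10;
```

A "DEPENDENT SUBQUERY" row scanning many rows in kills is the kind of thing to look for here.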

First, ensure your kills table has a compound index on killer and gid, then try this join:

SELECT p.handle, COUNT(*) AS n_kills
FROM players p 
    JOIN kills k 
    ON p.handle = k.killer
        AND p.gid = k.gid
WHERE p.gid = 3245 
    AND p.team = "axis"
    AND k.k_team != k.c_team
GROUP BY p.handle
ORDER BY n_kills DESC LIMIT 0,10

Seeing the CREATE TABLE statement for those two tables would help determine any issues with your indexes.
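A sketch of the suggested compound index (the index name is illustrative):

```sql
-- Compound index so lookups on (killer, gid) can be resolved from the
-- index instead of scanning the kills table for every player row.
ALTER TABLE kills ADD INDEX idx_killer_gid (killer, gid);
```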

Upvotes: 3

Daniel A. White

Reputation: 190945

Have you tried putting an index on the kills table killer column?

Edit: Info on indexes: http://www.w3schools.com/Sql/sql_create_index.asp
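For example, assuming the table and column names from the question (the index name is made up):

```sql
-- Index the column used in the correlated subquery's WHERE clause,
-- so each COUNT(*) lookup can use the index rather than a full scan.
CREATE INDEX idx_kills_killer ON kills (killer);
```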

Upvotes: 1

cjk

Reputation: 46425

Try:

SELECT handle, COUNT(*) AS kills
FROM players AS p
JOIN kills AS k ON k.gid = p.gid
WHERE p.gid = "3245"
AND team = "axis"
GROUP BY handle
ORDER BY kills DESC LIMIT 0, 10;

Upvotes: 0

jeffcook2150

Reputation: 4216

Yes, you should definitely look into joins. It's hard to tell from the snippet you posted, but whenever you can use a join over a subquery, it will be beneficial to do so.

You may also want to consider caching kill count somewhere else in your database; especially if you're using InnoDB, a COUNT() operation takes more time than simply SELECTing a [relatively] recent value from the database. You can probably implement this easily in the code by incrementing the kill count on the appropriate record, or something like that.
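A rough sketch of that idea, assuming a cached kill_count column is added to players (the column name is hypothetical):

```sql
-- One-time schema change: add a cached counter to players.
ALTER TABLE players ADD COLUMN kill_count INT NOT NULL DEFAULT 0;

-- Whenever a kill is recorded, bump the cached value instead of
-- re-counting the kills table on every stats page view.
UPDATE players
SET kill_count = kill_count + 1
WHERE handle = 'someplayer' AND gid = 3245;
```

The stats query then becomes a plain indexed SELECT on players, at the cost of keeping the counter in sync in application code.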

Upvotes: 0
