termux

Reputation: 341

Attempting to speed up MySQL queries on large tables

I am hitting some performance issues on a MySQL server.

I am trying to query a large table (~500k rows) for a subset of data:

SELECT * FROM `my_table` WHERE `subset_id` = id_value;

This query takes ~80ms to execute, but I need to run it for 20k different "id_value"s, which brings the total execution time to almost 1h. I was hoping that adding an index on subset_id would help, but it hasn't changed anything (which makes sense, given how indexes work).

What I am trying to figure out is whether there is any way to "index" the table so that this query takes something more reasonable than 80ms. Or, in other words, is ~80ms "normal" for querying a 500k-row table?
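For reference, an index on the filter column should turn this lookup from a full table scan into an index search, and you can verify that with the database's query-plan output. A minimal sketch of that check, using SQLite in place of MySQL and made-up table contents (in MySQL the equivalent tool is `EXPLAIN SELECT ...`):

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE my_table (id INTEGER PRIMARY KEY, subset_id INTEGER, payload TEXT)")
cur.executemany(
    "INSERT INTO my_table (subset_id, payload) VALUES (?, ?)",
    [(i % 1000, "x") for i in range(10000)],
)

# Without an index, the WHERE clause forces a scan of the whole table.
plan_before = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM my_table WHERE subset_id = 42"
).fetchone()[-1]
print(plan_before)  # a SCAN over my_table

cur.execute("CREATE INDEX idx_subset ON my_table (subset_id)")

# With the index, the planner switches to an index search.
plan_after = cur.execute(
    "EXPLAIN QUERY PLAN SELECT * FROM my_table WHERE subset_id = 42"
).fetchone()[-1]
print(plan_after)  # a SEARCH using idx_subset
```

If the plan still shows a scan after adding the index, the index isn't being used, and that is the first thing worth fixing before tuning anything else.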

Note: in the larger picture, I am using parallel queries and multiple connections to speed up the process, and I have tried various optimizations such as changing the InnoDB buffer size. I have also considered querying the database once for all 500k rows instead of issuing 20k separate queries, but since my code is designed around multiprocessing/coroutines for scalability, I was trying to avoid that and instead focus on optimizing the query and the MySQL server at the lowest level.

Thanks!

Upvotes: 0

Views: 189

Answers (1)

Gordon Linoff

Reputation: 1269803

Use a single query with IN rather than a zillion queries:

SELECT *
FROM `my_table`
WHERE `subset_id` IN (id1, id2, . . .);

If your ids are already in a table -- or you can put them in one -- then use that table instead. You can still use IN:

SELECT *
FROM `my_table`
WHERE `subset_id` IN (SELECT id FROM idtable);
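The win here is collapsing 20k round trips into one (or a few). A minimal sketch of the IN-list approach with bound placeholders, using SQLite and invented sample data as a stand-in for the MySQL table in the question:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
cur.execute("CREATE TABLE my_table (subset_id INTEGER, payload TEXT)")
# Hypothetical data: each subset_id appears 10 times.
cur.executemany(
    "INSERT INTO my_table VALUES (?, ?)",
    [(i % 100, f"row{i}") for i in range(1000)],
)

wanted = [3, 17, 42]  # stand-in for the 20k id_value list

# One round trip: build an IN (...) clause from placeholders and bind the ids.
placeholders = ", ".join("?" for _ in wanted)
batched = cur.execute(
    f"SELECT * FROM my_table WHERE subset_id IN ({placeholders})", wanted
).fetchall()

# Same rows as running one query per id, minus the per-query overhead.
looped = []
for sid in wanted:
    looped.extend(
        cur.execute("SELECT * FROM my_table WHERE subset_id = ?", (sid,)).fetchall()
    )

print(len(batched), len(looped))
```

For 20k ids you would likely chunk the list into a few large IN batches (or load the ids into a temporary table and use the subquery form above) rather than building one enormous statement.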

Upvotes: 1
