Reputation: 43
I want to know how using the LIMIT syntax affects query performance. My table has 180,000 rows, and I plan to select data from it using LIMIT, so I would like to understand the relationship between LIMIT and query performance.
Upvotes: 2
Views: 602
Reputation: 5631
The easiest way to quickly gauge the impact of any SQL statement is to use EXPLAIN and check the estimated number of rows examined. Starting by EXPLAINing the query is usually a good route: just put EXPLAIN before your query to find out how expensive it is.
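For example, a hypothetical query (the table and column names here are invented purely for illustration) can be checked like this:

-- Table and column names below are made up for illustration.
-- The "rows" column of the output estimates how many rows MySQL will examine,
-- which is a rough proxy for how expensive the query is.
EXPLAIN
SELECT id, name
FROM customers
ORDER BY created_at DESC
LIMIT 20;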
Sending large data sets might not be the problem -- it's the sorting
In my experience it's the use of ORDER BY that is extremely slow. LIMIT and ORDER BY are frequently used together for obvious reasons, and it's the sorting that's the slow part, rather than LIMIT itself.
LIMIT does avoid returning large data sets that might otherwise be discarded, but in practice the database is usually on the same local network or even the same server, so network performance and large result sets usually aren't the limiting factor. Instead, the highest cost is sorting the data before returning it. You can get the most benefit by optimising for those sorts, e.g. by creating indexes that match your queries, as in the sketch below.
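As a rough sketch (assuming a hypothetical orders table; the names are not from the question), an index that matches both the filter and the sort lets MySQL read rows already in order and stop as soon as the LIMIT is satisfied, avoiding a filesort:

-- Hypothetical table and columns, for illustration only.
CREATE INDEX idx_orders_status_created ON orders (status, created_at);

-- With that index, MySQL can read the matching rows in created_at order and
-- stop after 20 of them instead of sorting the whole result set.
SELECT order_id, status, created_at
FROM orders
WHERE status = 'shipped'
ORDER BY created_at DESC
LIMIT 20;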
Upvotes: 1
Reputation: 108651
LIMIT usually saves part of the cost of sending large result sets from the MySQL server to the requesting client. It's good to use LIMIT if you need only a few result set rows, rather than simply skipping the un-needed rows on the client side.
There's a notorious performance antipattern using LIMIT. A query like this
SELECT a,whole,mess,of,columns,...
FROM big_table JOIN big_tableb ON something JOIN big_tablec ON something ....
ORDER BY whole, mess DESC
LIMIT 5
in MySQL wastes server resources (time and RAM). Why? It generates a big result set, then sorts it, then discards all but a few rows.
Another performance antipattern is LIMIT big_number, small_number (a large offset followed by a small row count) applied to a complex result set. The server has to romp through many rows just to return a small number of them.
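For concreteness, a sketch of that second antipattern, reusing the placeholder table names from the query above:

-- The server must build and step past 200,000 sorted rows before it can
-- return the 5 rows that are actually wanted.
SELECT a,whole,mess,of,columns,...
FROM big_table JOIN big_tableb ON something JOIN big_tablec ON something ....
ORDER BY whole, mess DESC
LIMIT 200000, 5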
You can work around these with a deferred join pattern, something like this:
SELECT a,whole,mess,of,columns,..
FROM (
       SELECT big_table_id
       FROM big_table JOIN big_tableb ON something JOIN big_tablec ON something ....
       ORDER BY whole, mess DESC
       LIMIT 200000, 5
     ) ids
JOIN big_table ON ids.big_table_id = big_table.big_table_id
JOIN big_tableb ON something JOIN big_tablec ON something ...
This pattern sorts and then discards just the id values rather than a whole mess of columns.
Using LIMIT really helps performance in situations where the result set is ordered via an index. For example, if you have an index on datestamp and you do
SELECT datestamp, col, col
FROM table
ORDER BY datestamp DESC
LIMIT 20
the MySQL query planner can scan backwards through the datestamp index and retrieve just twenty rows.
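A minimal sketch of the supporting index (the table and column names here are placeholders, not from the question); it is this index that the planner walks in reverse:

-- Placeholder names. The index lets MySQL read the newest 20 entries
-- straight from the index instead of sorting the whole table.
CREATE INDEX idx_datestamp ON my_table (datestamp);

-- EXPLAIN should now show a small "rows" estimate for the query above
-- (and "Backward index scan" in the Extra column on MySQL 8.0).
EXPLAIN
SELECT datestamp, col1, col2
FROM my_table
ORDER BY datestamp DESC
LIMIT 20;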
Upvotes: 3