jkdba

Reputation: 2509

MySQL query on a billion records and timeout issues

I have a MySQL database with one table that is growing by 5,000 rows a second. I expect this table to get into the billions of records. I have a query function in PHP right now:

 $result = mysqli_query($con, "SELECT * FROM `table` WHERE `data` = '" . $input . "' LIMIT 0, 30");

The problem is obvious: the bigger the table gets, the longer it takes to query the latest records, to the point where the function just times out. If I run the same query directly in SQL, the same thing happens.

Is there a better way to query this table to speed up query times? Is it possible to query starting from the last record, the first record, and incrementally in between to speed this up?

Also, I am not dead set on using PHP; if there is a faster alternative, please let me know.

Also, would it be better to split the table into multiple tables and run the query on the multiple smaller tables at the same time?

Upvotes: 1

Views: 4022

Answers (3)

Raymond Nijland

Reputation: 11602

Indexes are one part of the story here.

Try to avoid LIMIT with a large start offset. If you use LIMIT 1000000, 30, the MySQL server needs to read 1,000,030 records and then throw 1,000,000 of them away to keep 30, which is very wasteful, especially when MySQL has to spill the temporary table to disk; that happens when the in-memory temp table becomes too large.
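As an illustration of the alternative, a keyset ("seek") query avoids the growing offset entirely. This is only a sketch: it assumes the table has an auto-increment primary key `id` and an index covering `(data, id)`; the table and column names are taken from the question.

```sql
-- Offset pagination: cost grows with the offset, since the
-- server must read and discard the first 1,000,000 rows.
SELECT * FROM `table` WHERE `data` = 'x' ORDER BY id LIMIT 1000000, 30;

-- Keyset ("seek") pagination: remember the last id the client saw
-- and continue from there; the index jumps straight to that point.
SELECT * FROM `table`
WHERE `data` = 'x' AND id > 1000000   -- last id from the previous page
ORDER BY id
LIMIT 30;
```

The seek form stays fast no matter how deep you page, because each page starts from an index lookup rather than a scan-and-discard.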

Upvotes: 1

user1759572

Reputation: 683

If you have such a big table and the values you are interested in are the "latest" ones, then you could improve performance using MySQL partitioning. For more, you can check this article.
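A minimal sketch of what range partitioning by date could look like; the table name, columns, and date ranges here are all illustrative, not taken from the question. Note that MySQL requires the partitioning column to be part of every unique key, which is why it appears in the primary key:

```sql
-- Partition by month so queries on recent data only
-- touch the newest partition(s) instead of the whole table.
CREATE TABLE events (
    id         BIGINT UNSIGNED NOT NULL AUTO_INCREMENT,
    data       VARCHAR(255) NOT NULL,
    created_at DATETIME NOT NULL,
    PRIMARY KEY (id, created_at),   -- partition column must be in the PK
    KEY idx_data (data)
)
PARTITION BY RANGE (TO_DAYS(created_at)) (
    PARTITION p2014_01 VALUES LESS THAN (TO_DAYS('2014-02-01')),
    PARTITION p2014_02 VALUES LESS THAN (TO_DAYS('2014-03-01')),
    PARTITION pmax     VALUES LESS THAN MAXVALUE
);
```

A query with a `created_at` range in its WHERE clause can then be pruned to the matching partitions, so "latest records" queries scan only the newest one.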

Upvotes: 1

Björn3

Reputation: 297

Indexes are very important here; they make a huge difference. Also make sure the schema is normalized correctly.
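For instance (table and column names assumed from the question), an index on the filtered column lets MySQL look up matching rows directly instead of scanning every row:

```sql
-- Without an index on `data`, the SELECT in the question scans the
-- whole table; with one, MySQL jumps straight to the matching rows.
ALTER TABLE `table` ADD INDEX idx_data (`data`);
```

On a table receiving 5,000 inserts a second, keep in mind that every extra index also adds write overhead, so index only what the queries actually filter on.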

Another, more hands-on idea is to not have one big database, but to separate it into many smaller ones based on a criterion you will know at query time, such as date, first letter, etc. It is not practical to have one big database growing all the time; it has to be divided.

Another division could be to store older data in bigger, slower chunks while keeping the newer, frequently used data in small hot tables.
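One way to sketch that hot/cold split (the table names and the 30-day cutoff are purely illustrative): periodically move rows older than a cutoff into an archive table so the table the application queries stays small.

```sql
-- Copy rows older than 30 days into the archive table,
-- then remove them from the hot table.
INSERT INTO events_archive
    SELECT * FROM events
    WHERE created_at < NOW() - INTERVAL 30 DAY;

DELETE FROM events
    WHERE created_at < NOW() - INTERVAL 30 DAY;
```

In practice this would run as a scheduled job, and the two statements should be batched or wrapped in a transaction so the copy and delete stay consistent.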

Upvotes: 3
