Reputation: 1481
I'm working on an API written in NodeJS that connects to a MySQL 5.5 database. On our live system, we have one query that's running at over 7 seconds. We've traced this down to a couple of tables that have over 5 million records each and are incorrectly indexed.
On our dev areas, we don't have that number of records. I think I've identified the query that's causing the problem but, short of putting 5 million records on my laptop (which would take ages to generate), I cannot prove it.
Is there a setting/technique that I can use to emulate having millions of database records without actually inserting them?
Upvotes: 2
Views: 107
Reputation: 1481
My original thought was to get a lot of data into the database to highlight the issue. I managed to get a million records in just by leaving a script hammering the (dev) API overnight.
The EXPLAIN statement was pretty helpful, and I found EXPLAIN EXTENDED useful too. However, neither of these highlighted the problem, as it wasn't an indexing issue.
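For reference, usage is along these lines (the table and column names here are placeholders, not the actual schema):

-- EXPLAIN shows the chosen index and the optimiser's row estimate
EXPLAIN SELECT * FROM orders WHERE customer_id = 42;

-- EXPLAIN EXTENDED (MySQL 5.5) adds a "filtered" column, and a following
-- SHOW WARNINGS prints the query as the optimiser rewrote it
EXPLAIN EXTENDED SELECT * FROM orders WHERE customer_id = 42;
SHOW WARNINGS;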
I also found the "RESET QUERY CACHE" statement useful, as it cleared down the cached data for debugging purposes.
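For anyone else debugging this way, the relevant statements in MySQL 5.5 are as follows (RESET QUERY CACHE needs the RELOAD privilege; the table name is a placeholder):

RESET QUERY CACHE;  -- empty the query cache entirely

-- Or bypass the cache for a single query while timing it
SELECT SQL_NO_CACHE COUNT(*) FROM orders;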
Some numpty had put a "WHERE DATE_FORMAT() = ''" in there. Although I was quite annoyed at how long it took me to notice it (it was a Friday afternoon - be kind), I found and fixed the issue. Thanks all.
The moral of the story is: NEVER STICK FUNCTIONS IN WHERE CLAUSES!!!
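To spell out why (column and date values here are illustrative): wrapping an indexed column in a function forces MySQL to evaluate that function for every row, so the index on the column can't be used. Comparing the bare column against constants keeps the query sargable:

-- Bad: DATE_FORMAT() runs on every row, so an index on created_at is ignored
SELECT * FROM orders WHERE DATE_FORMAT(created_at, '%Y-%m-%d') = '2014-06-01';

-- Good: the bare column is compared to a range, so the index can be used
SELECT * FROM orders
WHERE created_at >= '2014-06-01' AND created_at < '2014-06-02';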
Upvotes: 0
Reputation: 116028
You can write a Perl or Python script to populate your big test tables. If you do it as a single transaction, it should not take very long.
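The single-transaction point just avoids paying for a commit (and its disk flush) on every row; whatever the script is written in, the SQL it emits should look roughly like this (table name is a placeholder):

START TRANSACTION;
INSERT INTO mytable(name) VALUES ('row 1');
INSERT INTO mytable(name) VALUES ('row 2');
-- ... thousands more generated inserts ...
COMMIT;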
If MySQL supported generate_series() as PostgreSQL does, it would have been much easier to do without scripting, but unfortunately it doesn't :(.
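For comparison, with an equivalent table in PostgreSQL a single statement does the whole job:

-- PostgreSQL only: one INSERT produces a million rows
INSERT INTO mytable(name) SELECT 'Blah' FROM generate_series(1, 1000000);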
However, you can still easily create big tables in MySQL (or any other SQL database) without scripting. The main idea is to use INSERT INTO ... SELECT, like this:
CREATE TABLE mytable (
    id INTEGER PRIMARY KEY AUTO_INCREMENT,
    name VARCHAR(16)
);

-- Seed a single row, then let each INSERT ... SELECT double the row count
INSERT INTO mytable(name) VALUES ('Blah');
INSERT INTO mytable(name) SELECT name FROM mytable;  -- 2 rows
INSERT INTO mytable(name) SELECT name FROM mytable;  -- 4 rows
...
Note that each subsequent INSERT doubles the table size, so the seed row plus 20 of these statements yields over a million rows (2^20 = 1,048,576).
Using this approach, this SQLFiddle creates 1 million test rows using about 20 lines of SQL in less than 5 seconds.
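If typing twenty near-identical lines grates, the doubling can be wrapped in a loop. A minimal sketch, assuming MySQL and a hypothetical procedure name:

DELIMITER //
CREATE PROCEDURE double_up_to (IN target INT)
BEGIN
    DECLARE n INT;
    SELECT COUNT(*) INTO n FROM mytable;
    -- Assumes the seed row exists; doubling an empty table would loop forever
    WHILE n > 0 AND n < target DO
        INSERT INTO mytable(name) SELECT name FROM mytable;
        SELECT COUNT(*) INTO n FROM mytable;
    END WHILE;
END //
DELIMITER ;

CALL double_up_to(1000000);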
Upvotes: 2
Reputation: 73
In Oracle, you can load table statistics in from other databases - i.e. production to dev - and, in effect, mock the data volumes used when doing things like explain plans. It might be worth seeing if something equivalent exists under MySQL. It's not perfect, but it would give you a better idea of what's going on under the hood.
Sorry I can't be more specific, I've just woken up and I haven't had tea yet.
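For the record, the Oracle mechanism is the DBMS_STATS package; a rough sketch, with placeholder schema and table names:

-- On production: copy the optimiser statistics into a portable table
EXEC DBMS_STATS.CREATE_STAT_TABLE('APP', 'STATS_EXPORT');
EXEC DBMS_STATS.EXPORT_TABLE_STATS('APP', 'BIG_TABLE', stattab => 'STATS_EXPORT');

-- Ship STATS_EXPORT to dev (e.g. via export/import), then load it there
EXEC DBMS_STATS.IMPORT_TABLE_STATS('APP', 'BIG_TABLE', stattab => 'STATS_EXPORT');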
Upvotes: 0