Cédric Girard

Reputation: 3410

Unit test application including database is too slow

I do a lot of unit testing (mainly PHP/MySQL), but the SQL scripts used to create the database are way too slow to run, so I waste a lot of time. I cannot always mock the database (legacy code, too complex to handle), so what can I do? Copy MySQL data files directly? Load my DB in another way?

Loading lots of data is quick; it's only the CREATE TABLE statements that are slow.

Upvotes: 3

Views: 2481

Answers (6)

etov

Reputation: 3032

You can use MySQL's MEMORY engine.

It's much, much faster than InnoDB, and while it doesn't support all features (e.g. foreign keys), it's very useful for tests.

Assuming you have a script that sets up your DB from scratch (e.g. one that runs when the tests are initialized), you can simply change the engine definitions from InnoDB to MEMORY.

You can even make the engine a parameter of your test suite, so that you run the fast MEMORY version on each save/commit, and the slower but more robust InnoDB version before releases.

Compared with other suggestions here:

  • It's cross-platform (as opposed to moving the DB to memory by copying files, which only works on Linux)
  • Easier and faster than using a virtual server à la Docker/VMware
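Assuming the schema lives in a plain SQL file (the file name and table definitions below are made up for illustration), the engine swap can be a one-line sed at test-suite startup:

```shell
# A stand-in schema file with the usual InnoDB engine clauses
cat > schema.sql <<'EOF'
CREATE TABLE users (id INT PRIMARY KEY) ENGINE=InnoDB;
CREATE TABLE posts (id INT PRIMARY KEY) ENGINE=InnoDB;
EOF

# Pick the engine per run: MEMORY for fast local test cycles,
# InnoDB for the more robust pre-release run.
TEST_ENGINE=MEMORY
sed "s/ENGINE=InnoDB/ENGINE=${TEST_ENGINE}/g" schema.sql > schema_test.sql

cat schema_test.sql
```

schema_test.sql would then be piped into mysql whenever the test database is (re)created.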

Upvotes: 1

Cédric Girard

Reputation: 3410

Reading another question gave me the answer: just move my test database into memory. Under Debian/Ubuntu (my case), I only have to move my test DB's directory to /dev/shm, create a symlink so the old directory points to the new one, restart the MySQL server, and tada!!
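Those steps, sketched generically below, use a stand-in directory rather than the real MySQL datadir (typically /var/lib/mysql); in reality you would stop mysqld first, preserve ownership with chown mysql:mysql, and restart the server afterwards:

```shell
# Stand-in for the MySQL datadir and the test database's directory
DATADIR=$(mktemp -d)
mkdir -p "$DATADIR/testdb"
echo "dummy table file" > "$DATADIR/testdb/t.frm"

# Target the RAM-backed tmpfs on Linux; fall back to a temp dir so
# this sketch stays runnable on other systems.
RAMDIR=$(mktemp -d /dev/shm/dbtest.XXXXXX 2>/dev/null || mktemp -d)

# 1. Move the test database's directory into RAM
mv "$DATADIR/testdb" "$RAMDIR/testdb"
# 2. Symlink the old path to the new location so MySQL still finds it
ln -s "$RAMDIR/testdb" "$DATADIR/testdb"

# The files are still reachable through the original path
cat "$DATADIR/testdb/t.frm"
```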

A suite which took 140s to run now runs in 10s. Such an obvious thing to do! This project uses the database a lot during tests. Another one, which runs in 18–20s with the database on disk, does not run faster with the database in memory, but it has more unit tests than the other project and fewer tables to create in integration testing.

Upvotes: 2

Steven

Reputation: 172646

You can use an already-initialized (test) database. This way you only have to:

  1. Start a transaction,
  2. Insert data specific to your tests into the database,
  3. Run your test,
  4. Roll back the transaction.
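In SQL, the cycle above looks roughly like this (table and column names are invented; note that ROLLBACK only works if the tables use a transactional engine such as InnoDB — MyISAM and MEMORY ignore it):

```sql
START TRANSACTION;

-- 2. insert data specific to this test
INSERT INTO users (id, name) VALUES (1, 'fixture user');

-- 3. the test runs its queries against the fixture
SELECT name FROM users WHERE id = 1;

-- 4. undo the inserts, leaving the pre-initialized schema untouched
ROLLBACK;
```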

When you do this, you will need to maintain a database migration script that lets you update the test database when the schema changes, but you are probably already doing this with your current approach, and you would need to do it anyway when rolling out a new version of your software.

Doing things this way will still be quite slow, but considerably faster than running all the DDL scripts for each test. If possible, write unit tests instead of integration tests, but as you said, that is hard given the state of the system (legacy).

Upvotes: 1

Randy

Reputation: 16677

Another option is to set up the database in a VMware instance... then just reset it when needed.

Upvotes: 0

dj_segfault

Reputation: 12419

I would:

  1. Create a test database the way I want it
  2. Do a mysqldump of the database with --add-drop-table and --no-autocommit so there's only one huge transaction
  3. Import the dump whenever you want to reset the database
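As shell commands, that procedure might look like this (the database name and user are placeholders, and a running MySQL server is assumed, so this is an illustration rather than a self-contained snippet):

```shell
# 2. Dump the prepared test database once; --add-drop-table lets the
#    import replace existing tables, and --no-autocommit wraps each
#    table's inserts so the import is not committed row by row
mysqldump -u testuser -p --add-drop-table --no-autocommit testdb > testdb.sql

# 3. Reset the database before a test run by replaying the dump
mysql -u testuser -p testdb < testdb.sql
```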

Upvotes: 0

Luixv

Reputation: 8710

I've been in a similar situation. I created a toy database just for testing; this DB has only a few records.

Upvotes: 0
