Reputation: 136191
I have an architectural dilemma regarding JUnit test structure.
My test class tests the loading of a dictionary from a large file. The loading may fail due to lack of memory, a missing file, a malformed file structure, and so on.
On the one hand, it makes good sense to run the tests in a specific order: check for memory, then check that the file exists, then check its structure. This would be very helpful, because if we don't have enough memory, the appropriate test would fail and give meaningful output instead of a cryptic OutOfMemoryError. Moreover, I wouldn't have to go through the time-consuming process of reading the file over and over again.
On the other hand, tests should be isolated, self-contained entities.
Any ideas how to design the test suite?
Upvotes: 2
Views: 1573
Reputation: 160191
To add to what Péter says:
An OOME unit test shouldn't rely on actually running out of memory. Instead, whatever calls for the file load should react appropriately when the code-that-loads throws the OOME. This is what mocking and stubbing are for: simulating behavior outside of what is being tested. (The same goes for missing files and file-structure errors.)
IMO, loading a file that's too big for the JVM configuration isn't really a useful thing to test in the code that actually loads the file, at least not in a unit test, as long as whatever calls for the file load handles an OOME appropriately.
If the code-that-loads needs some behavior verification when the file is too big, I'd consider stubbing out the file-load part and forcing an OOME rather than actually loading a giant file, just to save time. To reiterate his point: actual physical behavior can be moved to integration tests and, say, run on a CI server outside of your own TDD heartbeat.
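A minimal sketch of that idea, using plain Java rather than a mocking library. The names `DictionaryLoader` and `DictionaryService` are hypothetical, invented for illustration; the point is that a stub can throw the `OutOfMemoryError` directly, so the caller's error handling is exercised without ever allocating a giant file:

```java
// Hypothetical interfaces/classes for illustration only.
interface DictionaryLoader {
    void load(String path);
}

class DictionaryService {
    private final DictionaryLoader loader;

    DictionaryService(DictionaryLoader loader) {
        this.loader = loader;
    }

    // The caller turns the raw error into a meaningful status
    // instead of letting it escape cryptically.
    String loadDictionary(String path) {
        try {
            loader.load(path);
            return "OK";
        } catch (OutOfMemoryError e) {
            return "ERROR: dictionary too large for configured heap";
        }
    }
}

public class OomStubDemo {
    public static void main(String[] args) {
        // Stub that simulates the OOME instantly, no giant file needed.
        DictionaryLoader oomStub = path -> {
            throw new OutOfMemoryError("simulated");
        };
        String result = new DictionaryService(oomStub).loadDictionary("big.dict");
        System.out.println(result); // prints the friendly error message
    }
}
```

With a library like Mockito the stub would be a one-liner, but the structure is the same: the test verifies the caller's reaction, not the JVM's actual memory limits.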
Upvotes: 1
Reputation: 116266
What you are describing is an integration test rather than a unit test. Unit tests (ideally) should not rely on the file system, the amount of memory available, etc.
You should set up your unit tests with a manageable amount of data in the dictionary (preferably initialized e.g. from a string (stream) within the test fixture code, not from an external file). This way you can reinitialize your test fixture independently for each unit test case, and test whatever nitty-gritty details you need to about the structure of the dictionary.
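As a rough sketch of that setup, the parser can be written against a `Reader` instead of a file path, so each test feeds it an in-memory `StringReader`. The `parseDictionary` method and the `word=meaning` line format are assumptions made up for this example:

```java
import java.io.BufferedReader;
import java.io.StringReader;
import java.util.Map;
import java.util.TreeMap;

public class InMemoryFixtureDemo {
    // Illustrative parser: reads "word=meaning" lines from any Reader,
    // so tests never touch the real file system.
    static Map<String, String> parseDictionary(BufferedReader in) throws Exception {
        Map<String, String> dict = new TreeMap<>();
        String line;
        while ((line = in.readLine()) != null) {
            String[] parts = line.split("=", 2);
            if (parts.length == 2) {
                dict.put(parts[0].trim(), parts[1].trim());
            }
        }
        return dict;
    }

    public static void main(String[] args) throws Exception {
        // Each test case can build its own tiny fixture from a string.
        String fixture = "cat=feline\ndog=canine";
        Map<String, String> dict =
                parseDictionary(new BufferedReader(new StringReader(fixture)));
        System.out.println(dict); // {cat=feline, dog=canine}
    }
}
```

Because the fixture is a few bytes of string, a fresh dictionary can be rebuilt in a `@Before` method for every test case with no shared state and no ordering concerns.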
And of course, you should also create separate, high-level integration tests to verify that your system as a whole works as expected under real-life circumstances. If these are separated from the genuine unit tests, you can afford to run the (cheap and fast) unit tests via JUnit after any code change, and run the costlier integration tests by some other means, more rarely, depending on how long they take, e.g. once every night.

You may also choose between setting up the integration environment once and running your integration tests (via some means other than JUnit, e.g. a shell / batch script) in a well-defined order, or reloading the dictionary file for each test case, as the extended execution time may not matter during a nightly build.
Upvotes: 3