Reputation: 6787
Technology: JUnit (latest version). The application is business oriented.
Some people use hard-coded data for test cases, some use properties files, and some use XML files. As far as I know, XML is better than the other two. Is there a better approach in use in industry? Please suggest best practices for developing test cases.
Upvotes: 3
Views: 1287
Reputation: 1477
As long as your data files are on the classpath, I don't see a problem. One unit-testing best practice is that a test case should be isolated and not depend on external resources. If you use external files (XML, CSV, etc.), you will certainly slow your tests down.
If the amount of data is not big, parameterized tests can be a good fit. Yes, the data is hardcoded, but in a clean way:
import static org.junit.jupiter.api.Assertions.assertEquals;

import java.util.stream.Stream;

import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.provider.Arguments;
import org.junit.jupiter.params.provider.MethodSource;

@ParameterizedTest
@MethodSource("provideStringsForLengthTest")
void testStringLength(String input, int expectedLength) {
    // StringUtils is the class under test, not the test class itself
    StringUtils utils = new StringUtils();
    assertEquals(expectedLength, utils.getStringLength(input));
}

// Method providing the test data
private static Stream<Arguments> provideStringsForLengthTest() {
    return Stream.of(
        Arguments.of("hello", 5),
        Arguments.of("JUnit", 5),
        Arguments.of("parameterized", 13)
    );
}
This tutorial about unit testing best practices discusses unit-test isolation and other recommendations.
Upvotes: 0
Reputation: 114767
I'd try to keep the tests fast and simple. The faster the tests run, the more tests you can add to your build.
The disadvantage of XML: parsing is quite expensive, and so is reading values from the DOM. For tabular data, I'd use flat files in some sort of CSV format. For key/value data, a simple properties file is absolutely sufficient.
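For the key/value case, here is a minimal sketch using only the JDK's java.util.Properties; the loader class name and the keys are made up for the example, and in a real test the reader would wrap a file from the classpath (e.g. via getClass().getResourceAsStream(...)):

```java
import java.io.IOException;
import java.io.StringReader;
import java.util.Properties;

// Hypothetical loader: parses key/value test data in properties format.
// The StringReader stands in for a classpath resource stream.
class TestDataLoader {
    static Properties load(String content) throws IOException {
        Properties props = new Properties();
        props.load(new StringReader(content));
        return props;
    }
}
```

A test can then read values such as Integer.parseInt(props.getProperty("expected.length")) and assert against them.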
With JUnit we're at the unit-testing level: we want to know whether the public interfaces are implemented according to the specs and behave in a defined way for all possible inputs. Therefore I usually hardcode the test values in the test methods, because they usually don't change (there's no need to edit the values outside of the test classes).
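A minimal sketch of that style, using a hypothetical PriceCalculator; in a real JUnit test the check would be an assertEquals call inside a @Test method:

```java
// Hypothetical class under test
class PriceCalculator {
    double grossPrice(double net, double vatRate) {
        return net * (1.0 + vatRate);
    }
}

class PriceCalculatorCheck {
    // The expected value is hardcoded right next to the call, so the
    // reader needs no second window (data file) to understand the case.
    static boolean check() {
        PriceCalculator calc = new PriceCalculator();
        return Math.abs(calc.grossPrice(100.0, 0.19) - 119.0) < 1e-9;
    }
}
```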
Upvotes: 2
Reputation: 40894
It is important that the mapping between data representation in the test and data passed to the function being tested is as transparent as possible. Hard-coded data is totally ok if the data are few and easy to observe right in the source. The fewer windows you need to keep open to understand a test case, the better.
XML is best for nested, tree-like data, but it's a bit wordy. YAML may also be good for this. For flat data, properties and just line-organized files are ok.
There's no single format that is superior to all others in every way. Choose the easiest and most natural one for a particular test suite / subject area. Investing some effort in handling the most natural format pays off when you need to quickly produce more and more test cases, and again when you investigate a regression. E.g., in our (quite large) project we had to invent several data representations and write (simple) custom parsers just to make writing and reading test-case data a breeze.
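As an illustration, such a custom parser can be very small. The "input -> expected" line format below is invented for this sketch; the point is that a domain-specific format can be both easy to write by hand and trivial to parse:

```java
import java.util.ArrayList;
import java.util.List;

// Parses lines like "hello -> 5" into (input, expected) pairs.
// Blank lines and "#" comments are skipped, so the data file
// stays readable for humans as well.
class CaseFileParser {
    record Case(String input, int expected) {}

    static List<Case> parse(String text) {
        List<Case> cases = new ArrayList<>();
        for (String line : text.split("\n")) {
            line = line.trim();
            if (line.isEmpty() || line.startsWith("#")) continue;
            String[] parts = line.split("->");
            cases.add(new Case(parts[0].trim(), Integer.parseInt(parts[1].trim())));
        }
        return cases;
    }
}
```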
Upvotes: 3
Reputation: 718788
I don't think there are best practices. I suggest that you use the one that works best for your particular problem space, and the kind of testing you need to perform. If the tests you need to code essentially involve calling methods with a large number of different inputs, then a data driven approach (using properties files, XML, or something else) is a good idea. If not, it is a bad idea.
The one thing to watch is spending too much time creating complicated infrastructure so that you can code your tests neatly.
Upvotes: 2