de.la.ru

Reputation: 3144

JUnit 5 - Convert multiple CSV file parameters into single object

Hello there Java testers. I started working with JUnit 5 a few days ago because I loved the new way of creating parameterized tests.

There's @ParameterizedTest, which allows a test to run in a parameterized fashion, and @CsvFileSource, which loads the parameters from a CSV file.

The thing is that I have too many columns in my CSV and don't want a huge method signature in my unit test. Let me give you an example:

@ParameterizedTest
@CsvFileSource(resources = "/test-data.csv")
void myTest(String p1, String p2, String p3, String p4, String p5, String p6) {
    //test using parameters
}

I'd like to know if there's some kind of converter that can do something like this for me:

@ParameterizedTest
@CsvFileSource(resources = "/test-data.csv")
void myTest(@ConvertWith(TestDataConverter.class) TestData testData) {
    //test using parameters
}

static class TestDataConverter implements TestConverter {
    public TestData convert(Object... params) {
        // a simple method that creates the TestData object and inserts the params into it
    }
}

And that's it.

Upvotes: 4

Views: 5980

Answers (3)

Sam Brannen

Reputation: 31247

Updated Answer

As of JUnit Jupiter 5.2: Yes, there are dedicated ArgumentsAggregator and ArgumentsAccessor APIs exactly for this purpose.

Take a look at the PersonAggregator example in the JUnit User Guide for a concrete example.
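Following the PersonAggregator pattern from the User Guide, a minimal sketch for the six-column case from the question might look like the code below. TestData, its field names, and MyTests are assumptions for illustration, not part of the original question:

```java
import org.junit.jupiter.api.extension.ParameterContext;
import org.junit.jupiter.params.ParameterizedTest;
import org.junit.jupiter.params.aggregator.AggregateWith;
import org.junit.jupiter.params.aggregator.ArgumentsAccessor;
import org.junit.jupiter.params.aggregator.ArgumentsAggregationException;
import org.junit.jupiter.params.aggregator.ArgumentsAggregator;
import org.junit.jupiter.params.provider.CsvFileSource;

class TestData {
    final String p1, p2, p3, p4, p5, p6;

    TestData(String p1, String p2, String p3, String p4, String p5, String p6) {
        this.p1 = p1; this.p2 = p2; this.p3 = p3;
        this.p4 = p4; this.p5 = p5; this.p6 = p6;
    }
}

class TestDataAggregator implements ArgumentsAggregator {
    @Override
    public TestData aggregateArguments(ArgumentsAccessor accessor, ParameterContext context)
            throws ArgumentsAggregationException {
        // Collapse the six CSV columns of the current row into one TestData object
        return new TestData(
                accessor.getString(0), accessor.getString(1), accessor.getString(2),
                accessor.getString(3), accessor.getString(4), accessor.getString(5));
    }
}

class MyTests {
    @ParameterizedTest
    @CsvFileSource(resources = "/test-data.csv")
    void myTest(@AggregateWith(TestDataAggregator.class) TestData testData) {
        // test using testData.p1 ... testData.p6
    }
}
```

With @AggregateWith, the custom @ConvertWith converter sketched in the question is no longer needed: the aggregator receives the entire row through the ArgumentsAccessor.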

Original Answer

As of JUnit Jupiter 5.0: No, there is currently no out-of-the-box converter that could do that for you.

Reason: The parameterized test support in JUnit Jupiter does not support mapping from multiple arguments to a single argument. So, as @Sormuras suggested, you could open an issue to recommend that.

In terms of performing the actual conversion, you could consider using uniVocity-parsers, which JUnit Jupiter uses internally to parse CSV files. uniVocity-parsers also has support for mapping directly from CSV files onto beans, but to use that you'd need to implement your own @TestTemplate that reads the CSV files and performs the mappings.

Other options include the CSV support from the Jackson framework or from the JSefa project.

Upvotes: 7

Cau No

Reputation: 1

You can use this aggregator, though it is somewhat hacky:

public class CsvToMapAggregator implements ArgumentsAggregator {

    @Retention(RetentionPolicy.RUNTIME)
    @Target(ElementType.PARAMETER)
    @AggregateWith(CsvToMapAggregator.class)
    public @interface CsvToMap {}

    private static String[] keys = null; // ATTN: Do not use in parallel runs !!!

    @Override
    public Map<String, Object> aggregateArguments(ArgumentsAccessor accessor, ParameterContext context)
            throws ArgumentsAggregationException {
        Object[] values = accessor.toArray();
        if (keys == null) {
            // First line: remember the header columns as keys.
            // (Copy element by element: casting the Object[] to String[] directly
            // would throw a ClassCastException at runtime.)
            keys = Arrays.stream(values).map(String::valueOf).toArray(String[]::new);
        } else if (values.length != keys.length) {
            keys = null; // Last line: end of data
            return null;
        }
        Map<String, Object> map = new HashMap<>();
        for (int i = 0; i < values.length; i++) {
            map.put(keys[i], values[i]); // Map each column value to its header key
        }
        return map;
    }
}

Note: All lines must have the same number of items.

The CSV file must end with an additional line (e.g. "EOF" in the first column) whose column count differs from the previous lines.

The test method will look like this, to skip the header and end line:

@ParameterizedTest(name = "{0}")
@CsvFileSource(resources = "/testcases.csv", delimiter = ';')
@DisplayName("read complete .csv and map columns")
void testCsvData(@CsvToMap Map<String, Object> map, TestInfo info) {
    switch (info.getDisplayName()) {
        case "Testcase": // First line: header
        case "EOF":      // Last line: end of file
            break;
        default:
            assertNotNull(map);
            assertFalse(map.isEmpty());
    }
}
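The header-to-value zipping at the heart of this aggregator can be exercised in plain Java, independent of JUnit. The class and column names below are illustrative:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class CsvRowToMapDemo {

    // Zip a header row with a data row into a column-name -> value map
    static Map<String, Object> rowToMap(String[] keys, Object[] values) {
        Map<String, Object> map = new LinkedHashMap<>();
        for (int i = 0; i < values.length; i++) {
            map.put(keys[i], values[i]);
        }
        return map;
    }

    public static void main(String[] args) {
        String[] header = {"Testcase", "input", "expected"};
        Object[] row = {"TC-1", "foo", "bar"};
        Map<String, Object> map = rowToMap(header, row);
        System.out.println(map.get("input")); // foo
    }
}
```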

Upvotes: 0

Carlo Bellettini

Reputation: 1180

Not a general solution, but one little trick that can sometimes work:

  • enclose each row of the CSV file in quotes, so that each row is treated as just ONE string (sed, awk, or any similar tool is your friend)
  • now implement the TestConverter to transform the single string into a String[] to be used as you prefer. In many cases it could be as simple as:

    static class TestDataConverter implements TestConverter {
        public TestData convert(String string) {
            String[] params = string.split(",");
            // create the TestData object and insert the params into it
            return new TestData(params);
        }
    }
    

UPDATE:

A very common case that does not permit direct use of this trick is when quotes are already used in the CSV to handle commas inside a field:

Name, complex field, City
Carlo, "this field could contain , and other critical chars" , London
Mario, "this field could contain , and other critical chars" , New York
Luca, "this field could contain , and other critical chars" , Milan

Quoting lines with the same quotes clearly does not work.

This or similar cases could however be solved with some additional (but boring) transformation of the CSV file.

For example, assuming that three consecutive hashes ("###") can be used as a consistent separator:

"Name###complex field###City"
"Carlo###this field could contain , and other critical chars###London"
"Mario###this field could contain , and other critical chars###New York"
"Luca###this field could contain , and other critical chars###Milan"

I know... it is a trick, not a definitive and elegant solution... :-D

Clearly, in this case the split will be on the three hashes:

String[] params = string.split("###");
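A quick stdlib-only check, using a row from the sample above, confirms the split survives the embedded commas:

```java
public class SharpSplitDemo {
    public static void main(String[] args) {
        // One quoted row of the transformed CSV, with "###" as the separator
        String row = "Carlo###this field could contain , and other critical chars###London";
        String[] params = row.split("###");
        System.out.println(params.length); // 3
        System.out.println(params[2]);     // London
    }
}
```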

Upvotes: 2
