Thomas Escolan

Reputation: 1529

Unmarshalling with Jackson: "The Json input stream must start with an array of Json objects"

I'm getting an error when unmarshalling files that contain only a single JSON object: "IllegalStateException: The Json input stream must start with an array of Json objects". I can't find any workaround, and I don't understand why it has to be this way.

@Bean
public ItemReader<JsonHar> reader(@Value("file:${json.resources.path}/*.json") Resource[] resources) {
    log.info("Processing JSON resources: {}", Arrays.toString(resources));
    JsonItemReader<JsonHar> delegate = new JsonItemReaderBuilder<JsonHar>()
            .jsonObjectReader(new JacksonJsonObjectReader<>(JsonHar.class))
            .resource(resources[0])  //FIXME had to force this, but fails anyway because the file is "{...}" and not "[...]"
            .name("jsonItemReader")
            .build();
    MultiResourceItemReader<JsonHar> reader = new MultiResourceItemReader<>();
    reader.setDelegate(delegate);
    reader.setResources(resources);
    return reader;
}

I need a way to unmarshal single-object files. What's the point of forcing arrays (which I won't have in my use case)?

Upvotes: 1

Views: 1857

Answers (3)

Thomas Escolan

Reputation: 1529

Though this may not be ideal, this is how I handled the situation:

@Bean
public ItemReader<JsonHar> reader(@Value("file:${json.resources.path}/*.json") Resource[] resources) {
    log.info("Processing JSON resources: {}", Arrays.toString(resources));
    JsonItemReader<JsonHar> delegate = new JsonItemReaderBuilder<JsonHar>()
            .jsonObjectReader(new JacksonJsonObjectReader<>(JsonHar.class))
            .resource(resources[0]) //DEBUG had to force this because of NPE...
            .name("jsonItemReader")
            .build();
    MultiResourceItemReader<JsonHar> reader = new MultiResourceItemReader<>();
    reader.setDelegate(delegate);
    reader.setResources(Arrays.stream(resources)
            .map(WrappedResource::new) // wrap each resource so its content is presented as a JSON array
            .toArray(Resource[]::new));
    return reader;
}
@RequiredArgsConstructor
static class WrappedResource implements Resource {
    @Delegate(excludes = InputStreamSource.class)
    private final Resource resource;
    @Override
    public InputStream getInputStream() throws IOException {
        log.info("Wrapping resource: {}", resource.getFilename());
        InputStream in = resource.getInputStream();
        BufferedReader reader = new BufferedReader(new InputStreamReader(in, UTF_8));
        String wrap = reader.lines().collect(Collectors.joining())
                .replaceAll("[^\\x00-\\xFF]", "");  // strips off all non-ASCII characters
        return new ByteArrayInputStream(("[" + wrap + "]").getBytes(UTF_8));
    }
}

Upvotes: 0

Ismael Sarmento

Reputation: 894

Definitely not ideal, @thomas-escolan. As @mahmoud-ben-hassine pointed out, the ideal approach is to write a custom reader.

In case new Stack Overflow users stumble on this question, here is a code example of how to do it.
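
The sketch below is one way to implement such a reader against Spring Batch's JsonObjectReader interface; the class name SingleJsonObjectReader and the plain ObjectMapper usage are illustrative choices, not part of Spring Batch.

import java.io.InputStream;

import com.fasterxml.jackson.databind.ObjectMapper;
import org.springframework.batch.item.json.JsonObjectReader;
import org.springframework.core.io.Resource;

// Maps the entire resource to a single object, then signals end of input
public class SingleJsonObjectReader<T> implements JsonObjectReader<T> {

    private final Class<T> itemType;
    private final ObjectMapper mapper = new ObjectMapper();
    private InputStream inputStream;
    private boolean itemRead;

    public SingleJsonObjectReader(Class<T> itemType) {
        this.itemType = itemType;
    }

    @Override
    public void open(Resource resource) throws Exception {
        this.inputStream = resource.getInputStream();
        this.itemRead = false;
    }

    @Override
    public T read() throws Exception {
        if (itemRead) {
            return null; // null tells the framework there are no more items
        }
        itemRead = true;
        return mapper.readValue(inputStream, itemType); // the whole file is one item
    }

    @Override
    public void close() throws Exception {
        if (inputStream != null) {
            inputStream.close();
        }
    }
}

The delegate from the original configuration can then use it in place of JacksonJsonObjectReader:

JsonItemReader<JsonHar> delegate = new JsonItemReaderBuilder<JsonHar>()
        .jsonObjectReader(new SingleJsonObjectReader<>(JsonHar.class))
        .name("jsonItemReader")
        .build();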

Upvotes: 1

Mahmoud Ben Hassine

Reputation: 31620

I don't understand why it has to be so.

The JsonItemReader is designed to read an array of objects because batch processing is usually about handling data sources with a lot of items, not a single item.

I can't find any workaround

JsonObjectReader is what you are looking for: you can implement it to read a single JSON object and use it with the JsonItemReader (either at construction time or via the setter). This is not a workaround; it is a strategy interface designed for specific use cases like yours.
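
For illustration, assuming a custom implementation such as the SingleJsonObjectReader sketched in the previous answer, and assuming the two-argument constructor and setters available on JsonItemReader in recent Spring Batch versions, the two wiring options look roughly like this:

// 'resource' is the Resource pointing at the single-object JSON file

// Option 1: pass the JsonObjectReader at construction time
JsonItemReader<JsonHar> readerViaConstructor =
        new JsonItemReader<>(resource, new SingleJsonObjectReader<>(JsonHar.class));
readerViaConstructor.setName("jsonItemReader");

// Option 2: configure it via the setter
JsonItemReader<JsonHar> readerViaSetter = new JsonItemReader<>();
readerViaSetter.setResource(resource);
readerViaSetter.setJsonObjectReader(new SingleJsonObjectReader<>(JsonHar.class));
readerViaSetter.setName("jsonItemReader");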

Upvotes: 2
