Reputation: 835
We are using Spring Batch for some processing: a reader reads some ids, we want to process them as 'chunks' via a processor, then write to multiple files. But the processor interface only allows one item to be processed at a time, and we need bulk processing because the processor depends on a third-party service, and calling that service once per item is not an option.
I saw that we can create wrappers for the reader, processor, and writer involved in the 'chunk' so that they handle List<> and delegate to concrete reader/processor/writer implementations, but that doesn't seem very clean to me. Like this:
<batch:chunk reader="wrappedReader" processor="wrappedProcessor" writer="wrappedWriter"
             commit-interval="2"/>
Is there a 'chunking' option that applies chunking before the processor, rather than before the writer?
Cheers,
Upvotes: 6
Views: 5025
Reputation: 21493
I'd recommend reconfiguring your ItemReader to return the "chunk" that needs to be processed, since that chunk is really the "item" you're processing.
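A minimal sketch of that idea, assuming a delegate reader that returns single ids. The ChunkingItemReader name is mine, and the inlined ItemReader interface is a stand-in so the sketch compiles on its own; in a real job you would implement org.springframework.batch.item.ItemReader and register this as the step's reader:

```java
import java.util.ArrayList;
import java.util.List;

// Stand-in for org.springframework.batch.item.ItemReader, so the sketch is self-contained.
interface ItemReader<T> { T read() throws Exception; }

// Hypothetical reader that groups items from a delegate into lists of up to
// groupSize, so each "item" the step processes is actually a chunk of ids.
class ChunkingItemReader<T> implements ItemReader<List<T>> {
    private final ItemReader<T> delegate;
    private final int groupSize;

    ChunkingItemReader(ItemReader<T> delegate, int groupSize) {
        this.delegate = delegate;
        this.groupSize = groupSize;
    }

    @Override
    public List<T> read() throws Exception {
        List<T> group = new ArrayList<>();
        T item;
        // Pull up to groupSize items from the delegate; stop early at end of input.
        while (group.size() < groupSize && (item = delegate.read()) != null) {
            group.add(item);
        }
        return group.isEmpty() ? null : group; // null tells Spring Batch the input is exhausted
    }
}
```

Your processor's input type then becomes List<T>, so it can make one third-party call per group. Note that commit-interval now counts groups, not individual ids.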
Upvotes: 3
Reputation: 1964
Since modifying the whole Spring Batch mechanism to process multiple items at once seems really complicated, I suggest you move your third-party processing into a writer, which can actually process a chunk of items at once.
Since you'll obviously need to keep your current writer, you could simply use a CompositeItemWriter with 2 (or more) delegates: your new custom ItemWriter and your current one. The order of definition matters, since it is the order in which they'll be called.
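In the XML style used in the question, that wiring could look something like this. The bean ids and the ThirdPartyCallingWriter class are placeholders for your own beans:

```xml
<bean id="compositeWriter"
      class="org.springframework.batch.item.support.CompositeItemWriter">
    <property name="delegates">
        <list>
            <!-- hypothetical writer that sends the whole chunk to the third party -->
            <bean class="com.example.ThirdPartyCallingWriter"/>
            <!-- your existing writer, called second -->
            <ref bean="currentWriter"/>
        </list>
    </property>
</bean>
```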
UPDATE
UPDATE
Since using 2 distinct ItemWriters inside a CompositeItemWriter doesn't let the second one see modifications made by the first, you could also use an ItemWriteListener. By implementing the method beforeWrite, you can call your third party with the actual chunk right before it's written:
@Override
public void beforeWrite(List<? extends T> items) {
//Third party call on item chunk
}
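If you go this route with the XML configuration above, the listener is registered on the tasklet that owns the chunk (bean names here are placeholders):

```xml
<batch:tasklet>
    <batch:chunk reader="reader" processor="processor" writer="writer"
                 commit-interval="2"/>
    <batch:listeners>
        <batch:listener ref="thirdPartyWriteListener"/>
    </batch:listeners>
</batch:tasklet>
```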
Upvotes: 2