Reputation: 3689
I want my Spring Batch application to read 50 records from the database at a time and then send those 50 records to the processor and then the writer.
Can someone please tell me how this can be done?
I've tried using JdbcPagingItemReader and setting the pageSize to 50, which reads 50 records, but the rowMapper, processor and writer receive one record at a time instead of getting the 50 records.
How do I make it so that the processor and writer get the 50 records in a DTO instead of receiving one record at a time?
XML Spring config
<job id="indexJob" job-repository="jobRepository">
    <step id="job1">
        <tasklet transaction-manager="transactionManager">
            <chunk reader="reader" processor="processor" writer="writer" commit-interval="1"/>
        </tasklet>
    </step>
</job>
Java Spring configuration
@Bean
@Scope("step")
public JdbcPagingItemReader reader() throws Exception {
    MySqlPagingQueryProvider provider = new MySqlPagingQueryProvider();
    provider.setSelectClause("select id");
    provider.setFromClause("from BATCH_CUSTOMER");
    provider.setSortKey("id");

    JdbcPagingItemReader reader = new JdbcPagingItemReader();
    reader.setDataSource(this.dataSource());
    reader.setQueryProvider(provider);
    reader.setPageSize(50);
    reader.setRowMapper(new MyRowMapper());
    reader.afterPropertiesSet();

    // Debug: open the reader and read through the results to see what comes back
    int counter = 0;
    ExecutionContext executionContext = new ExecutionContext();
    reader.open(executionContext);
    Object pageCredit = new Object();
    while (pageCredit != null) {
        pageCredit = reader.read();
        System.out.println("pageCredit:" + pageCredit);
        counter++;
    }
    reader.close();

    return reader;
}
Upvotes: 3
Views: 10120
Reputation: 3689
Answered in the Spring Batch forum.
It is all by design: each record is processed individually. The ItemWriter gets as many records as you want, but that number is bound by the commit-interval. Yours is 1, which means each record is committed individually; I suggest you set it to 50. The processor processes each record by itself until the commit interval is reached, and then the writer is called with the whole chunk. As mentioned, yours is 1, so the writer currently receives one record at a time.
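For example, using the XML from the question, raising the commit-interval should make the writer receive chunks of 50 items. This is only a sketch, assuming the same bean names (reader, processor, writer) as above:
<job id="indexJob" job-repository="jobRepository">
    <step id="job1">
        <tasklet transaction-manager="transactionManager">
            <!-- commit-interval is the chunk size: items are accumulated and handed to the writer in groups of 50 -->
            <chunk reader="reader" processor="processor" writer="writer" commit-interval="50"/>
        </tasklet>
    </step>
</job>
Note that the ItemProcessor is still called once per item; only the ItemWriter receives the accumulated chunk as a List once the commit interval is reached.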
Upvotes: 0