Reputation: 71
I'm working on an app that extracts records from an Oracle database and exports them as a single delimited file.
However, when I try to read from the DB using JdbcPagingItemReader and write to a file, I only get the number of records specified by pageSize. So if the pageSize is 10, I get a file with 10 lines and the rest of the records seem to be ignored. So far I haven't been able to find out what is really going on, and any help would be most welcome.
Here is the JdbcPagingItemReader config:
<bean id="databaseItemReader"
class="org.springframework.batch.item.database.JdbcPagingItemReader" >
<property name="dataSource" ref="dataSourceTest" />
<property name="queryProvider">
<bean
class="org.springframework.batch.item.database.support.SqlPagingQueryProviderFactoryBean">
<property name="dataSource" ref="dataSourceTest" />
<property name="selectClause" value="SELECT *" />
<property name="fromClause" value="FROM *****" />
<property name="whereClause" value="where snapshot_id=:name" />
<property name="sortKey" value="snapshot_id" />
</bean>
</property>
<property name="parameterValues">
<map>
<entry key="name" value="18596" />
</map>
</property>
<property name="pageSize" value="100000" />
<property name="rowMapper">
<bean class="com.mkyong.ViewRowMapper" />
</property>
<bean id="itemWriter" class="org.springframework.batch.item.file.FlatFileItemWriter">
<!-- write to this csv file -->
<property name="resource" value="file:cvs/report.csv" />
<property name="shouldDeleteIfExists" value="true" />
<property name="lineAggregator">
<bean
class="org.springframework.batch.item.file.transform.DelimitedLineAggregator">
<property name="delimiter" value=";" />
<property name="fieldExtractor">
<bean class="org.springframework.batch.item.file.transform.BeanWrapperFieldExtractor">
<property name="names" value="ID" />
</bean>
</property>
</bean>
</property>
<job id="testJob" xmlns="http://www.springframework.org/schema/batch">
<step id="step1">
<tasklet>
<chunk reader="databaseItemReader" writer="itemWriter" commit-interval="1" />
</tasklet>
</step>
Thanks.
Upvotes: 2
Views: 6548
Reputation: 947
Adding @StepScope is what made my item reader start paging properly.
@Bean
@StepScope // step scope allows late binding of the job parameter and gives each step execution its own reader instance
ItemReader<Account> itemReader(@Value("#{jobParameters[id]}") String id) {
    JdbcPagingItemReader<Account> databaseReader = new JdbcPagingItemReader<>();
    databaseReader.setDataSource(dataSource);
    databaseReader.setPageSize(100);
    databaseReader.setFetchSize(100);
    PagingQueryProvider queryProvider = createQueryProvider(id);
    databaseReader.setQueryProvider(queryProvider);
    databaseReader.setRowMapper(new BeanPropertyRowMapper<>(Account.class));
    return databaseReader;
}
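createQueryProvider(id) isn't shown above; here is a minimal sketch of what it could look like, assuming the table and columns from the question (the table name is redacted there, so "accounts" below is only a placeholder) and a unique "id" column used as the sort key:
import java.util.Collections;
import org.springframework.batch.item.database.Order;
import org.springframework.batch.item.database.PagingQueryProvider;
import org.springframework.batch.item.database.support.SqlPagingQueryProviderFactoryBean;

private PagingQueryProvider createQueryProvider(String id) {
    SqlPagingQueryProviderFactoryBean factory = new SqlPagingQueryProviderFactoryBean();
    factory.setDataSource(dataSource);                    // lets the factory detect the Oracle paging dialect
    factory.setSelectClause("SELECT *");
    factory.setFromClause("FROM accounts");               // placeholder table name
    factory.setWhereClause("WHERE snapshot_id = " + id);  // job parameter inlined; setParameterValues on the reader is an alternative
    factory.setSortKeys(Collections.singletonMap("id", Order.ASCENDING)); // sort on a unique column, not on the filter column
    try {
        return factory.getObject();
    } catch (Exception e) {
        throw new IllegalStateException("Could not build paging query provider", e);
    }
}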
Upvotes: 0
Reputation: 2749
Your configuration looks wrong: the whereClause column and the sortKey cannot be the same, because paging works hand in hand with the sort column.
Check what the data in the corresponding table looks like.
With your configuration, Spring Batch will build and execute queries roughly like the following. The first query, with pageSize = 10, looks like:
SELECT TOP 10 * FROM tableName WHERE snapshot_id = 18596
The second and subsequent queries depend on your sort key:
SELECT * FROM tableName WHERE snapshot_id = 18596 AND snapshot_id > 10
SELECT * FROM tableName WHERE snapshot_id = 18596 AND snapshot_id > 20
and so on. Try running these queries in the database; don't they look weird? :-)
If you don't need the where clause, remove it.
And if possible, keep the page size and the commit-interval the same, because that is how you decide how much to process and persist at a time. But of course that depends on your design, so it's your call.
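In Java config, for example, the two values can be kept next to each other so they don't drift apart. This is only a sketch (Spring Batch 4.x style), reusing the Account type from the other answer and hypothetical reader/writer beans:
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.context.annotation.Bean;

// keep the reader's pageSize and the chunk size (commit-interval) the same
private static final int PAGE_SIZE = 100;

@Bean
public Step step1(StepBuilderFactory stepBuilderFactory,
                  ItemReader<Account> reader,   // e.g. a JdbcPagingItemReader configured with setPageSize(PAGE_SIZE)
                  ItemWriter<Account> writer) {
    return stepBuilderFactory.get("step1")
            .<Account, Account>chunk(PAGE_SIZE) // commit-interval == pageSize
            .reader(reader)
            .writer(writer)
            .build();
}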
Upvotes: 1
Reputation: 71
It was scope="step" that was missing; it should be:
<bean id="databaseItemReader"
class="org.springframework.batch.item.database.JdbcPagingItemReader" scope="step">
Upvotes: 2