Reputation: 635
Background
I have a Spring Batch job where:
- FlatFileItemReader - reads one row at a time from the file.
- ItemProcessor - transforms the row from the file into a List<MyObject> and returns the list. That is, each row in the file is broken down into a List<MyObject> (1 row in the file transformed to many output rows).
- ItemWriter - writes the List<MyObject> to a database table. (I used this implementation to unpack the list received from the processor and delegate to a JdbcBatchItemWriter; a sketch of that idea follows below.)
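Since the linked implementation isn't shown here, a minimal sketch of what such a list-unpacking writer can look like, assuming the Spring Batch 4 ItemWriter signature (the class and field names are illustrative):
import java.util.ArrayList;
import java.util.List;
import org.springframework.batch.item.ItemWriter;

// Illustrative sketch: flattens the List<T> items in a chunk and delegates
// the individual items to another writer, e.g. a JdbcBatchItemWriter.
public class ListUnpackingItemWriter<T> implements ItemWriter<List<T>> {

    private final ItemWriter<T> delegate;

    public ListUnpackingItemWriter(ItemWriter<T> delegate) {
        this.delegate = delegate;
    }

    @Override
    public void write(List<? extends List<T>> lists) throws Exception {
        List<T> flattened = new ArrayList<>();
        for (List<T> list : lists) {
            flattened.addAll(list);
        }
        delegate.write(flattened); // one delegate call per chunk
    }
}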
Question
Suppose one row in the file is transformed into a List of 100000 MyObject instances. The JdbcBatchItemWriter will end up writing the entire List of 100000 objects to the database. My question is: the JdbcBatchItemWriter does not allow a custom batch size. For all practical purposes, the batch size equals the commit interval for the step (a minimal step definition illustrating this coupling follows below). With this in mind, is there another implementation of an ItemWriter available in Spring Batch that writes to the database and allows a configurable batch size? If not, how do I go about writing a custom writer myself to achieve this?
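For context, here is a minimal chunk-oriented step definition showing where the commit interval is set; FileRow and the bean names are illustrative, not from the original job:
import java.util.List;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.item.ItemProcessor;
import org.springframework.batch.item.ItemReader;
import org.springframework.batch.item.ItemWriter;
import org.springframework.context.annotation.Bean;

// Belongs in a @Configuration class. The commit interval passed to
// chunk(...) is also the number of items handed to the writer per call.
@Bean
public Step step(StepBuilderFactory stepBuilderFactory,
                 ItemReader<FileRow> reader,
                 ItemProcessor<FileRow, List<MyObject>> processor,
                 ItemWriter<List<MyObject>> writer) {
    return stepBuilderFactory.get("myStep")
            .<FileRow, List<MyObject>>chunk(10) // commit interval: 10 file rows per transaction
            .reader(reader)
            .processor(processor)
            .writer(writer)
            .build();
}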
Upvotes: 2
Views: 5467
Reputation: 3868
I wouldn't do this. It presents issues for restartability. Instead, modify your reader to produce individual items rather than having your processor take in an object and return a list.
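A minimal sketch of that idea: a reader that wraps the row-at-a-time reader and serves the expanded items one by one. The class name and the expander function are illustrative, and a production version would also need to persist its position for restartability:
import java.util.Iterator;
import java.util.List;
import java.util.function.Function;
import org.springframework.batch.item.ItemReader;

// Illustrative sketch: wraps a row-at-a-time reader (e.g. a FlatFileItemReader)
// and emits the expanded items one by one, so the processor and writer deal
// with individual items instead of lists.
public class FlatteningItemReader<I, O> implements ItemReader<O> {

    private final ItemReader<I> delegate;
    private final Function<I, List<O>> expander; // the 1-row-to-many transformation
    private Iterator<O> buffer;

    public FlatteningItemReader(ItemReader<I> delegate, Function<I, List<O>> expander) {
        this.delegate = delegate;
        this.expander = expander;
    }

    @Override
    public O read() throws Exception {
        while (buffer == null || !buffer.hasNext()) {
            I row = delegate.read();
            if (row == null) {
                return null; // end of input
            }
            buffer = expander.apply(row).iterator();
        }
        return buffer.next();
    }
}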
Upvotes: 0
Reputation: 635
The answer from Mahmoud Ben Hassine and the comments pretty much cover all aspects of the solution, so that is the accepted answer.
Here is the implementation I used, in case anyone is interested:
import java.util.Collections;
import java.util.List;
import org.springframework.batch.item.ItemWriter;
import org.springframework.jdbc.core.ParameterizedPreparedStatementSetter;
import org.springframework.jdbc.core.support.JdbcDaoSupport;

public class JdbcCustomBatchSizeItemWriter<W> extends JdbcDaoSupport implements ItemWriter<W> {

    private int batchSize;
    private ParameterizedPreparedStatementSetter<W> preparedStatementSetter;
    private String sqlFileLocation;
    private String sql;

    // Loads the SQL from the external file before the writer is used.
    public void initReader() {
        this.setSql(FileUtilities.getFileContent(sqlFileLocation));
    }

    public void write(List<? extends W> items) throws Exception {
        // batchUpdate splits the list into batches of 'batchSize' statements.
        getJdbcTemplate().batchUpdate(sql, Collections.unmodifiableList(items), batchSize, preparedStatementSetter);
    }

    public void setBatchSize(int batchSize) {
        this.batchSize = batchSize;
    }

    public void setPreparedStatementSetter(ParameterizedPreparedStatementSetter<W> preparedStatementSetter) {
        this.preparedStatementSetter = preparedStatementSetter;
    }

    public void setSqlFileLocation(String sqlFileLocation) {
        this.sqlFileLocation = sqlFileLocation;
    }

    public void setSql(String sql) {
        this.sql = sql;
    }
}
Note:
- Collections.unmodifiableList prevents the need for any explicit casting.
- sqlFileLocation specifies an external file that contains the SQL, and FileUtilities.getFileContent simply returns the contents of this SQL file. This can be skipped, and one can instead pass the sql directly to the class while creating the bean.
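For completeness, here is one way this writer might be wired up as a bean; the table, SQL, and mapping shown are placeholders, not part of the original code:
// Belongs in a @Configuration class; DataSource injection is assumed.
@Bean
public JdbcCustomBatchSizeItemWriter<MyObject> customWriter(DataSource dataSource) {
    JdbcCustomBatchSizeItemWriter<MyObject> writer = new JdbcCustomBatchSizeItemWriter<>();
    writer.setDataSource(dataSource); // inherited from JdbcDaoSupport
    writer.setBatchSize(1000);        // flush to the database every 1000 items
    writer.setSql("INSERT INTO my_table (my_column) VALUES (?)"); // or setSqlFileLocation(...)
    // ParameterizedPreparedStatementSetter is a functional interface.
    writer.setPreparedStatementSetter((ps, item) -> ps.setString(1, item.toString())); // placeholder mapping
    return writer;
}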
Upvotes: 1
Reputation: 31600
I see no obvious way to set the batch size on the JdbcBatchItemWriter. However, you can extend the writer and use a custom BatchPreparedStatementSetter to specify the batch size. Here is a quick example:
import java.sql.PreparedStatement;
import java.sql.SQLException;
import java.util.List;
import org.springframework.batch.item.database.JdbcBatchItemWriter;
import org.springframework.jdbc.core.BatchPreparedStatementSetter;

public class MyCustomWriter<T> extends JdbcBatchItemWriter<T> {

    @Override
    public void write(List<? extends T> items) throws Exception {
        // Bypass the parent's named-parameter path and issue the batch
        // update directly, so the batch size can be controlled here.
        namedParameterJdbcTemplate.getJdbcOperations().batchUpdate("your sql", new BatchPreparedStatementSetter() {
            @Override
            public void setValues(PreparedStatement ps, int i) throws SQLException {
                // set values on your sql for items.get(i)
            }

            @Override
            public int getBatchSize() {
                return items.size(); // or any other value you want
            }
        });
    }
}
The StagingItemWriter in the Spring Batch samples is also an example of how to use a custom BatchPreparedStatementSetter.
Upvotes: 2