Reputation: 43
I am building a Spring Batch job that will be invoked through a web service. The web service will take a list of select and delete statement pairs. The records returned by the select statement will be saved as a CSV file on the filesystem, and then those same records will be deleted by executing the supplied delete statement.
I have seen a number of ColumnRowMapper examples, but those require me to create a POJO for each table entity. I am looking for a solution that will handle any column from any table. Any suggestions on an approach?
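For context, the request body I have in mind is roughly the following (the class and field names here are just placeholders, not anything I've settled on):
import java.util.List;

// Each pair carries the select used for the CSV export and the delete
// that removes the same records afterwards.
public class PurgeRequest {
    public List<StatementPair> pairs;

    public static class StatementPair {
        public String selectSql;
        public String deleteSql;
    }
}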
**UPDATE** Since writing this post, I've landed on the following solution.
@Bean
@StepScope
public JdbcCursorItemReader<Map<String, ?>> getRowsOfDataForExportFromTable() {
    JdbcCursorItemReader<Map<String, ?>> databaseReader = new JdbcCursorItemReader<>();
    databaseReader.setDataSource(jdbcTemplate.getDataSource());
    databaseReader.setSql("select * from SOME_TABLE where last_updated_date < DATE_SUB(NOW(), INTERVAL 10 DAY)");
    databaseReader.setRowMapper(new RowMapper<Map<String, ?>>() {
        @Override
        public Map<String, ?> mapRow(ResultSet resultSet, int rowNum) throws SQLException {
            // Build an ordered map of column name -> column value for the current row,
            // so no table-specific POJO is needed.
            ResultSetMetaData metaData = resultSet.getMetaData();
            int numOfColumns = metaData.getColumnCount();
            Map<String, String> resultMap = new LinkedHashMap<>();
            for (int j = 1; j <= numOfColumns; j++) {
                resultMap.put(metaData.getColumnName(j), resultSet.getString(j));
            }
            return resultMap;
        }
    });
    return databaseReader;
}
The above ItemReader uses a RowMapper that builds a LinkedHashMap for each row, with the column name as the key and the column value as the value.
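The delete half of each pair still needs its own step. A minimal sketch, assuming the delete statement is passed in as a job parameter named deleteSql and that this sits in the same configuration class as the reader with the same injected jdbcTemplate:
@Bean
@StepScope
public Tasklet deleteExportedRowsTasklet(@Value("#{jobParameters['deleteSql']}") String deleteSql) {
    // Runs the supplied delete statement once, after the export step has written the CSV.
    // 'deleteSql' is an assumed job parameter name, bound late via step scope.
    return (contribution, chunkContext) -> {
        jdbcTemplate.update(deleteSql);
        return RepeatStatus.FINISHED;
    };
}
A two-step job (the export step followed by this tasklet step) keeps the archive-then-delete order described above.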
Upvotes: 2
Views: 797
Reputation: 961
Did you try to use a Map instead of a POJO? You can fill it dynamically in the Reader, and then create the CSV file from that Map.
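For example, the writer can stay just as generic as the reader. A rough sketch using a FlatFileItemWriter, where the bean name, output path, and the simple comma join are assumptions (real CSV output would also need quoting/escaping):
@Bean
@StepScope
public FlatFileItemWriter<Map<String, ?>> csvWriter() {
    FlatFileItemWriter<Map<String, ?>> writer = new FlatFileItemWriter<>();
    writer.setResource(new FileSystemResource("/tmp/export.csv")); // assumed output location
    // Join the row's values in column order; the LinkedHashMap built by the reader
    // preserves that order. A header line could be added via setHeaderCallback.
    writer.setLineAggregator(row -> row.values().stream()
            .map(String::valueOf)
            .collect(Collectors.joining(",")));
    return writer;
}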
Upvotes: 1