dazzle

Reputation: 1346

Spring batch job to update different tables

I am reading the article http://spring.io/guides/gs/batch-processing/, which explains reading a CSV file and writing it back to a database. I want to know how I can read multiple CSV files, say A.csv, B.csv, etc., and write the content of each into its respective table (table_A, table_B, etc.). Please note that the content of each CSV file should go into a different table.

Upvotes: 1

Views: 2820

Answers (1)

Thrax

Reputation: 1964

The basic approach here would be to create as many steps as you have CSV files (a MultiResourceItemReader would not help, since it merges all resources into a single input stream rather than routing each file to its own table).

Each step would read one CSV file (with a FlatFileItemReader) and write to your database (using a JdbcBatchItemWriter or another writer of the same kind). Although you will have multiple steps, if your CSV files share the same format (columns, separators), you can factor out the common configuration into an abstract parent step. See the documentation: http://docs.spring.io/spring-batch/trunk/reference/html/configureStep.html

If not, you can at least share the common attributes, such as the LineMapper, ItemPreparedStatementSetter, and DataSource.
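To illustrate the sharing idea, here is a hedged Java-configuration sketch: a single LineMapper bean reused by a reader factory method, so each step's reader differs only in its input file. The class and method names (SharedCsvConfig, readerFor) are hypothetical, not from the original answer, and the column names are placeholders.

```java
// Sketch only (not from the original answer): assumes Spring Batch on the
// classpath, inside an @Configuration/@EnableBatchProcessing class.
import org.springframework.batch.item.file.FlatFileItemReader;
import org.springframework.batch.item.file.mapping.DefaultLineMapper;
import org.springframework.batch.item.file.mapping.PassThroughFieldSetMapper;
import org.springframework.batch.item.file.transform.DelimitedLineTokenizer;
import org.springframework.batch.item.file.transform.FieldSet;
import org.springframework.context.annotation.Bean;
import org.springframework.core.io.ClassPathResource;

public class SharedCsvConfig {

    // One LineMapper shared by every step's reader.
    @Bean
    public DefaultLineMapper<FieldSet> lineMapper() {
        DelimitedLineTokenizer tokenizer = new DelimitedLineTokenizer();
        tokenizer.setNames("column1", "column2", "column3"); // placeholder names
        DefaultLineMapper<FieldSet> mapper = new DefaultLineMapper<>();
        mapper.setLineTokenizer(tokenizer);
        mapper.setFieldSetMapper(new PassThroughFieldSetMapper());
        return mapper;
    }

    // Each step gets its own reader, differing only in the input file.
    private FlatFileItemReader<FieldSet> readerFor(String file) {
        FlatFileItemReader<FieldSet> reader = new FlatFileItemReader<>();
        reader.setResource(new ClassPathResource(file));
        reader.setLineMapper(lineMapper()); // same mapper for A.csv, B.csv, ...
        return reader;
    }
}
```

Calling readerFor("A.csv") and readerFor("B.csv") then yields two readers that share all parsing configuration.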


UPDATE

Here are examples for your readers and writers:

    <bean id="reader" class="org.springframework.batch.item.file.FlatFileItemReader">
        <property name="resource" value="yourFile.csv" />
        <property name="lineMapper">
            <bean class="org.springframework.batch.item.file.mapping.DefaultLineMapper">
                <property name="lineTokenizer">
                    <bean class="org.springframework.batch.item.file.transform.DelimitedLineTokenizer">
                        <property name="names" value="column1,column2,column3..." />
                    </bean>
                </property>
                <property name="fieldSetMapper">
                    <bean class="org.springframework.batch.item.file.mapping.BeanWrapperFieldSetMapper">
                        <!-- prototypeBeanName refers to a prototype-scoped bean definition, not a class name -->
                        <property name="prototypeBeanName" value="yourBean" />
                    </bean>
                </property>
            </bean>
        </property>
    </bean>

    <bean id="writer" class="org.springframework.batch.item.database.JdbcBatchItemWriter">
        <property name="dataSource" ref="dataSource" />
        <property name="sql">
            <value>
                <![CDATA[        
                    insert into YOUR_TABLE(column1,column2,column3...) 
                    values (:beanField1, :beanField2, :beanField3...)
                ]]>
            </value>
        </property>
        <property name="itemSqlParameterSourceProvider">
            <bean class="org.springframework.batch.item.database.BeanPropertyItemSqlParameterSourceProvider" />
        </property>
    </bean>

UPDATE 2

Here's an example of chaining the steps in the job (with Java-based configuration):

    @Bean
    public Job job() {
        return jobBuilderFactory().get("job")
                .incrementer(new RunIdIncrementer())
                .start(step1()).next(step2())
                .build();
    }
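Each step factory method referenced above would wrap one reader/writer pair. A minimal sketch, assuming an injected StepBuilderFactory and hypothetical reader/writer beans (readerForA, writerForTableA); the chunk size of 10 is illustrative, not from the original answer:

```java
// Sketch only: stepBuilderFactory, readerForA and writerForTableA are
// assumed to be provided elsewhere in the configuration.
@Bean
public Step step1() {
    return stepBuilderFactory.get("step1")
            .<FieldSet, FieldSet>chunk(10)   // commit interval, illustrative
            .reader(readerForA)              // FlatFileItemReader for A.csv
            .writer(writerForTableA)         // JdbcBatchItemWriter for table_A
            .build();
}
```

A step2() bean for B.csv and table_B would be defined the same way, then chained with .next(step2()) as shown in the job definition.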

Upvotes: 2
