Abraham Arnold

Reputation: 365

Spring Data JPA not saving to database when using Spring Batch

I have a Spring Boot application using Spring Data JPA with a single PostgreSQL database, and I am using Spring Batch to read a file and write the data to PostgreSQL. The program works when Spring Batch is allowed to create its meta-data tables in the database. However, I don't want the Spring Batch meta-data tables created in the database at all; I want Spring Batch to use its in-memory Map-based job repository instead. I tried many answers but none of them worked. And I see that

MapJobRepositoryFactoryBean is deprecated

Here is my BatchConfiguration.java class,

@Configuration
@EnableBatchProcessing
public class BatchConfiguration {

    private @Autowired JobBuilderFactory jobBuilderFactory;
    private @Autowired StepBuilderFactory stepBuilderFactory;
    private @Autowired UserItemReader userItemReader;
    private @Autowired UserItemProcessor userItemProcessor;
    private @Autowired UserItemWriter userItemWriter;

    @Bean
    public Step importUsersStep() {
        return stepBuilderFactory.get("STEP-01")
                .<User, User>chunk(10)
                .reader(userItemReader)
                .processor(userItemProcessor)
                .writer(userItemWriter)
                .build();
    }

    @Bean
    public Job importUsersJob() {
        return jobBuilderFactory.get("JOB-IMPORT")
                .flow(importUsersStep())
                .end()
                .build();
    }
}

And here is my UserRepository.java,

public interface UserRepository extends JpaRepository<User, Long> {

}

In the write method of UserItemWriter.java I call userRepository.saveAll().
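
The writer is essentially just the following (simplified sketch; details such as the @Component annotation and constructor injection are illustrative):

@Component
public class UserItemWriter implements ItemWriter<User> {

    private final UserRepository userRepository;

    public UserItemWriter(UserRepository userRepository) {
        this.userRepository = userRepository;
    }

    @Override
    public void write(List<? extends User> items) {
        // delegates persistence of the chunk to Spring Data JPA
        userRepository.saveAll(items);
    }
}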

So how can I make this work with the in-memory Map-based job repository, while the Spring Data JPA save still writes the users to PostgreSQL without any issue? With the approaches I tried, nothing was committed to PostgreSQL, and I suspect the user data was committed to the in-memory Map as well. Can anybody help me? Thanks in advance.

Upvotes: 1

Views: 3868

Answers (2)

Shawrup

Reputation: 2734

We solved this problem by defining multiple datasources: one in-memory datasource for the Spring Batch meta-data and one or more other datasources for the jobs, depending on the configuration.

# for spring batch metadata
spring.datasource.url=jdbc:h2:mem:testdb
spring.datasource.driverClassName=org.h2.Driver
spring.datasource.username=sa
spring.datasource.password=password

# for jobs
job.datasource.url=jdbc:postgresql://localhost:5432/postgres
job.datasource.username=user
job.datasource.password=password

We create the datasources manually, like so:

@Bean
@Primary
@ConfigurationProperties("spring.datasource")
public DataSourceProperties batchDataSourceProperties() {
    // bound to the in-memory H2 settings used for the Spring Batch meta-data
    return new DataSourceProperties();
}
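
Since Boot's auto-configuration backs off once any DataSource bean is defined, the primary DataSource itself typically also has to be declared. A minimal sketch built from the properties bean above (the bean name is illustrative):

@Bean
@Primary
public DataSource batchDataSource() {
    // the in-memory H2 datasource that Spring Batch will use for its meta-data tables
    return batchDataSourceProperties()
            .initializeDataSourceBuilder()
            .build();
}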

and a second configuration class for the job datasource:

@Configuration
@EnableJpaRepositories(basePackages = "your.package.name",
        entityManagerFactoryRef = "postgresEntityManagerFactory",
        transactionManagerRef = "postgresTransactionManager")
public class PostgresDataConfig {

    @Bean
    @ConfigurationProperties("job.datasource")
    public DataSourceProperties postgresDataSourceProperties() {
        return new DataSourceProperties();
    }

    @Bean
    public DataSource postgresDataSource() {
        // builds the PostgreSQL datasource from the job.datasource properties
        return postgresDataSourceProperties()
                .initializeDataSourceBuilder()
                .build();
    }

    @Bean(name = "postgresEntityManagerFactory")
    public LocalContainerEntityManagerFactoryBean postgresEntityManagerFactory(
            EntityManagerFactoryBuilder builder) {
        return builder
                .dataSource(postgresDataSource())
                .packages("your.package.name") // package containing your JPA entities, e.g. User
                .build();
    }

    @Bean(name = "postgresTransactionManager")
    public PlatformTransactionManager postgresTransactionManager(
            final @Qualifier("postgresEntityManagerFactory") LocalContainerEntityManagerFactoryBean entityManagerFactory) {
        return new JpaTransactionManager(Objects.requireNonNull(entityManagerFactory.getObject()));
    }
}

This lets the repositories use a datasource other than the default one.
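
As a side note, with Spring Boot 2.x the creation of the Spring Batch meta-data schema is controlled by the spring.batch.initialize-schema property. For an embedded database such as H2 the default value (embedded) already creates the tables on startup, but it can be set explicitly if needed:

# optional: always create the Spring Batch meta-data tables in the batch datasource
spring.batch.initialize-schema=always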

Upvotes: 2

Sergi Almar

Reputation: 8404

MapJobRepositoryFactoryBean was only recommended for testing and development purposes, not for production use; it has a number of issues, which is why it was deprecated.

Spring Batch will try to use the DataSource from your ApplicationContext if you have one. If you don't want Spring Batch to use it to store its state, you could use a JobRepositoryFactoryBean with an embedded database. The easiest way to do so is to define a BatchConfigurer bean:

@Bean
public BatchConfigurer configurer() {
    // points Spring Batch's job repository at the embedded HSQL database below
    return new DefaultBatchConfigurer(dataSource());
}

// deliberately not exposed as a @Bean, so it does not replace the PostgreSQL DataSource used by JPA
public DataSource dataSource() {
    return new EmbeddedDatabaseBuilder()
            .setType(EmbeddedDatabaseType.HSQL)
            .addScript("classpath:org/springframework/batch/core/schema-drop-hsqldb.sql")
            .addScript("classpath:org/springframework/batch/core/schema-hsqldb.sql")
            .build();
}

Upvotes: 1
