PainIsAMaster

Reputation: 2076

Spring Batch/Data JPA application not persisting/saving data to Postgres database when calling JPA repository (save, saveAll) methods

I am nearly at wits' end. I have read and googled endlessly and tried the solutions from all the Google/Stack Overflow posts that describe this same issue (there are quite a few). Some seemed promising, but nothing has worked for me yet, though I have made some progress and I believe I am on the right track (at this point I suspect it's something with the transaction manager and a possible conflict between Spring Batch and Spring Data JPA).

References:

  1. Spring boot repository does not save to the DB if called from scheduled job
  2. JpaItemWriter: no transaction is in progress

Similar to the aforementioned posts, I have a Spring Boot application that uses Spring Batch and Spring Data JPA. It reads comma-delimited data from a .csv file, does some processing/transformation, and attempts to persist/save to the database using the JPA repository methods, specifically .saveAll() here (I also tried the .save() method, which did the same thing), since I'm saving a List<MyUserDefinedDataType> of a user-defined data type (batch insert).
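To give an idea of what the writer does, here is a minimal sketch (MyUserDefinedDataType and MyUserDefinedDataTypeRepository are placeholder names standing in for my actual types):

// Minimal sketch of the custom writer; the type and repository names
// are placeholders for my actual classes.
public class MyCustomWriter implements ItemWriter<List<MyUserDefinedDataType>> {

    @Autowired
    private MyUserDefinedDataTypeRepository repository; // a Spring Data JpaRepository

    @Override
    public void write(List<? extends List<MyUserDefinedDataType>> items) {
        // Each chunk item is itself a List, hence the loop over batches
        for (List<MyUserDefinedDataType> batch : items) {
            repository.saveAll(batch); // the call that stops persisting on >= 2.2.1
        }
    }
}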

Now, my code was working fine on Spring Boot starter 1.5.9.RELEASE, but I recently attempted to upgrade to 2.x.x and found, after countless hours of debugging, that only version 2.2.0.RELEASE would persist/save data to the database. So an upgrade to >= 2.2.1.RELEASE breaks persistence. Everything is read fine from the .csv; it's just that the first time the code flow hits a JPA repository method like .save() or .saveAll(), the application keeps running but nothing gets persisted. I also noticed the Hikari pool logs "active=1 idle=4", whereas the same log on version 1.5.9.RELEASE says active=0 idle=5 immediately after persisting the data, so the application is definitely hanging. I went into the debugger and saw that after jumping into the repository calls, it goes into an almost infinite cycle through the Spring AOP libraries and such (all third party) and, I believe, never comes back to the real application/business logic that I wrote.

3c22fb53ed64 2021-05-20 23:53:43.909 DEBUG
                    [HikariPool-1 housekeeper] com.zaxxer.hikari.pool.HikariPool - HikariPool-1 - Pool stats (total=5, active=1, idle=4, waiting=0)

Anyway, I tried the most common solutions that worked for other people which were:

  1. Defining a JpaTransactionManager @Bean and injecting it into the Step function, while keeping the JobRepository on the PlatformTransactionManager. This did not work. Then I tried using the JpaTransactionManager in the JobRepository @Bean as well; this also did not work.
  2. Defining a @RestController endpoint in my application to trigger this job manually, instead of launching it from my main Application.java class (I talk about this more below). As in one of the posts I linked above, the data persisted correctly to the database even on Spring >= 2.2.1, which makes me suspect even more that something with the Spring Batch persistence/entity/transaction managers is messed up.

The code is basically this: BatchConfiguration.java

@Configuration
@EnableBatchProcessing
@Import({DatabaseConfiguration.class})
public class BatchConfiguration {

// Provided by @EnableBatchProcessing; declared here because they are used below
@Autowired
private JobBuilderFactory jobBuilderFactory;

@Autowired
private StepBuilderFactory stepBuilderFactory;

// Chunk size and thread count; configured elsewhere in the real class
private int chunkSize;
private int maxThreads;

// Datasource is a Postgres DB defined in a separate IntelliJ project that I add to my pom.xml
DataSource dataSource;

@Autowired
public BatchConfiguration(@Qualifier("dataSource") DataSource dataSource) {
    this.dataSource = dataSource;
}

@Bean
@Primary
public JpaTransactionManager jpaTransactionManager() {
    final JpaTransactionManager tm = new JpaTransactionManager();
    tm.setDataSource(dataSource);
    return tm;
}


@Bean
public JobRepository jobRepository(PlatformTransactionManager transactionManager) throws Exception {
    JobRepositoryFactoryBean jobRepositoryFactoryBean = new JobRepositoryFactoryBean();
    jobRepositoryFactoryBean.setDataSource(dataSource);
    jobRepositoryFactoryBean.setTransactionManager(transactionManager);
    jobRepositoryFactoryBean.setDatabaseType("POSTGRES");
    return jobRepositoryFactoryBean.getObject();
}

@Bean
public JobLauncher jobLauncher(JobRepository jobRepository) {
    SimpleJobLauncher simpleJobLauncher = new SimpleJobLauncher();
    simpleJobLauncher.setJobRepository(jobRepository);
    return simpleJobLauncher;
}

@Bean(name = "jobToLoadTheData")
public Job jobToLoadTheData() {
    return jobBuilderFactory.get("jobToLoadTheData")
            .start(stepToLoadData())
            .listener(new CustomJobListener())
            .build();
}

@Bean
@StepScope
public TaskExecutor taskExecutor() {
    ThreadPoolTaskExecutor threadPoolTaskExecutor = new ThreadPoolTaskExecutor();
    threadPoolTaskExecutor.setCorePoolSize(maxThreads);
    threadPoolTaskExecutor.setThreadGroupName("taskExecutor-batch");
    return threadPoolTaskExecutor;
}

@Bean(name = "stepToLoadData")
public Step stepToLoadData() {
    TaskletStep step = stepBuilderFactory.get("stepToLoadData")
            .transactionManager(jpaTransactionManager())
            .<List<FieldSet>, List<myCustomPayloadRecord>>chunk(chunkSize)
            .reader(myCustomFileItemReader(OVERRIDDEN_BY_EXPRESSION))
            .processor(myCustomPayloadRecordItemProcessor())
            .writer(myCustomerWriter())
            .faultTolerant()
            .skipPolicy(new AlwaysSkipItemSkipPolicy())
            .skip(DataValidationException.class)
            .listener(new CustomReaderListener())
            .listener(new CustomProcessListener())
            .listener(new CustomWriteListener())
            .listener(new CustomSkipListener())
            .taskExecutor(taskExecutor())
            .throttleLimit(maxThreads)
            .build();
    step.registerStepExecutionListener(stepExecutionListener());
    step.registerChunkListener(new CustomChunkListener());
    return step;
}

// ... other beans omitted for brevity (mentioned below) ...
}

My main method: Application.java

@Autowired
@Qualifier("jobToLoadTheData")
private Job loadTheData;

@Autowired
private JobLauncher jobLauncher;

@PostConstruct
public void launchJob() throws JobParametersInvalidException, JobExecutionAlreadyRunningException, JobRestartException, JobInstanceAlreadyCompleteException {
    JobParameters parameters = new JobParametersBuilder().addDate("random", new Date()).toJobParameters();
    jobLauncher.run(loadTheData, parameters);
}

public static void main(String[] args) {
    SpringApplication.run(Application.class, args);
}

Now, normally I'm reading this .csv from an Amazon S3 bucket, but since I'm testing locally, I am just placing the .csv in the project directory and reading it directly by triggering the job in the Application.java main class (as you can see above). Also, I do have some other beans defined in this BatchConfiguration class, but I don't want to over-complicate this post more than it already is, and from the googling I've done, the problem is most likely in the methods I posted (hopefully).

Also, I would like to point out that, just like in one of the other posts on Google/Stack Overflow where a user had a similar problem, I created a @RestController endpoint that simply calls the .run() method of the JobLauncher, passing in the jobToLoadTheData bean, and it triggers the batch insert. Guess what? Data persists to the database just fine, even on Spring >= 2.2.1.
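For reference, that endpoint boils down to something like this sketch (the class name and mapping path are placeholders, not my exact code):

// Sketch of the workaround endpoint; class name and path are placeholders.
@RestController
public class JobTriggerController {

    @Autowired
    @Qualifier("jobToLoadTheData")
    private Job loadTheData;

    @Autowired
    private JobLauncher jobLauncher;

    @PostMapping("/trigger-job")
    public ResponseEntity<String> triggerJob() throws Exception {
        JobParameters parameters = new JobParametersBuilder().addDate("random", new Date()).toJobParameters();
        jobLauncher.run(loadTheData, parameters);
        return ResponseEntity.ok("Job triggered");
    }
}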

What is going on here? Is this a clue? Is something funky going wrong with some type of entity or transaction manager? I'll take any advice or tips! I can provide any more information that you may need, so please just ask.

Upvotes: 3

Views: 4178

Answers (1)

Mahmoud Ben Hassine

Reputation: 31730

You are defining a bean of type JobRepository and expecting it to be picked up by Spring Batch. This is not correct. You need to provide a BatchConfigurer and override getJobRepository. This is explained in the reference documentation:

You can customize any of these beans by creating a custom implementation of the
BatchConfigurer interface. Typically, extending the DefaultBatchConfigurer
(which is provided if a BatchConfigurer is not found) and overriding the required
getter is sufficient.

This is also documented in the Javadoc of @EnableBatchProcessing. So in your case, you need to define a bean of type BatchConfigurer and override getJobRepository and getTransactionManager, something like:

@Bean
public BatchConfigurer batchConfigurer(EntityManagerFactory entityManagerFactory, DataSource dataSource) {
    return new DefaultBatchConfigurer(dataSource) {
        @Override
        public PlatformTransactionManager getTransactionManager() {
            return new JpaTransactionManager(entityManagerFactory);
        }

        @Override
        public JobRepository getJobRepository() {
            try {
                JobRepositoryFactoryBean jobRepositoryFactoryBean = new JobRepositoryFactoryBean();
                jobRepositoryFactoryBean.setDataSource(dataSource);
                jobRepositoryFactoryBean.setTransactionManager(getTransactionManager());
                // set other properties
                jobRepositoryFactoryBean.afterPropertiesSet();
                return jobRepositoryFactoryBean.getObject();
            } catch (Exception e) {
                // afterPropertiesSet() and getObject() throw checked exceptions,
                // which the BatchConfigurer contract does not allow here
                throw new IllegalStateException("Unable to create JobRepository", e);
            }
        }
    };
}

In a Spring Boot context, you could also override the createTransactionManager and createJobRepository methods of org.springframework.boot.autoconfigure.batch.JpaBatchConfigurer if needed.
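A rough sketch of that Boot-specific approach, assuming the Spring Boot 2.2.x constructor signature of JpaBatchConfigurer (the parameters are injected by Spring and may differ across Boot versions):

// Rough sketch; the superclass constructor signature is assumed from Boot 2.2.x
// and may differ in other Boot versions.
@Component
public class CustomJpaBatchConfigurer extends JpaBatchConfigurer {

    private final EntityManagerFactory entityManagerFactory;

    public CustomJpaBatchConfigurer(BatchProperties properties, DataSource dataSource,
            TransactionManagerCustomizers transactionManagerCustomizers,
            EntityManagerFactory entityManagerFactory) {
        super(properties, dataSource, transactionManagerCustomizers, entityManagerFactory);
        this.entityManagerFactory = entityManagerFactory;
    }

    @Override
    protected PlatformTransactionManager createTransactionManager() {
        // Use JPA-aware transactions so repository writes inside chunks are committed
        return new JpaTransactionManager(entityManagerFactory);
    }
}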

Upvotes: 1
