Reputation: 2580
I'm trying to configure two jobs with @EnableBatchProcessing(modular = true). As far as I understand, this is supposed to prevent bean naming clashes.
Here's my job configuration:
@Configuration
public class Dummy1 {

    @Autowired
    JobBuilderFactory jobBuilderFactory;

    @Autowired
    StepBuilderFactory stepBuilderFactory;

    @Bean
    public Step step() {
        // < build step. Omitted for code clarity >
    }

    @Bean
    public Job getJob() {
        return jobBuilderFactory.get("dummy-job-1")
                .start(step())
                .build();
    }
}
I have a similar class named Dummy2.
I also defined the following configuration:
@Configuration
@EnableAutoConfiguration
@EnableBatchProcessing(modular = true)
public class BatchConfig {

    @Bean
    public ApplicationContextFactory getDummy1() {
        return new GenericApplicationContextFactory(Dummy1.class);
    }

    @Bean
    public ApplicationContextFactory getDummy2() {
        return new GenericApplicationContextFactory(Dummy2.class);
    }
}
When running the application I'm getting:
The bean 'step', defined in class path resource [~PATH~/Dummy2.class], could not be registered. A bean with that name has already been defined in class path resource [~PATH~/Dummy1.class] and overriding is disabled.
But I thought that was the whole point of modular = true: to handle name clashes.
On the other hand, if I enable bean overriding, the second job simply overrides the first one, i.e. an @Autowired List&lt;Job&gt; contains only one job (the one from Dummy2).
How do I configure these jobs correctly?
Upvotes: 3
Views: 2195
Reputation: 21493
@EnableBatchProcessing is a Spring Batch annotation that pre-dates Spring Boot. As such, you need to consider how it works within the context of Spring Boot. I took a look at your sample application. Let me first explain what is going wrong, then I'll explain how to fix it.
The Problem
When you configure @EnableBatchProcessing(modular = true), per the javadoc, you should have no @Bean definitions in the current context that you do not want to be bootstrapped. Instead, you provide ApplicationContextFactory implementations as @Bean definitions, each of which defines the child context for a job.
However, in your application there is a catch. As noted, @EnableBatchProcessing pre-dates Spring Boot, so you need to think about how it works within the context of Spring Boot. In your case, the sample application has all the classes in the same package. By default, Spring Boot does a classpath scan for @Configuration-annotated classes in the package of the class annotated with @SpringBootApplication and below. So in your sample application, Spring Boot is automatically bringing Dummy1 and Dummy2 into what should be the parent context, causing your error.
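To see why scanning is the culprit: the duplicate step beans come from component scanning, not from the modular mechanism itself. One way to keep the child configurations out of the parent context without moving packages is to replace @SpringBootApplication with its constituent annotations and add an exclude filter. This is a sketch of an alternative I'm suggesting, not something from the original post; DemoApplication is a hypothetical main class name:

```java
// Equivalent to @SpringBootApplication, but with a customized component scan
// that skips the child-context configurations. They are then only registered
// through BatchConfig's ApplicationContextFactory beans.
@SpringBootConfiguration
@EnableAutoConfiguration
@ComponentScan(excludeFilters = @ComponentScan.Filter(
        type = FilterType.ASSIGNABLE_TYPE,
        classes = { Dummy1.class, Dummy2.class }))
public class DemoApplication {

    public static void main(String[] args) {
        SpringApplication.run(DemoApplication.class, args);
    }
}
```

Moving the classes out of the scanned package, as described below, achieves the same thing without the filter.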
The Solution
To fix this issue, you need to prevent Spring Boot from including your child context configurations in its classpath scanning. To prove this, I tested your sample app by moving Dummy1 and Dummy2 to the package com.example (one level above the class annotated with @SpringBootApplication). This prevented Spring Boot from picking them up with its classpath scanning and allowed the app to start correctly...with two other minor tweaks:
In Dummy1 and Dummy2, you configured both job names to be dummy-job-1. Change one to dummy-job-2 and that addresses this issue.
With those changes, you will be able to build your application and run it via the command java -jar target/demo-0.0.1-SNAPSHOT.jar --spring.batch.job.names=dummy-job-1 to run job 1, or java -jar target/demo-0.0.1-SNAPSHOT.jar --spring.batch.job.names=dummy-job-2 to run job 2.
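Putting both tweaks together, the second configuration would end up looking something like this. This is a sketch based on the question's Dummy1, assuming the same builder-factory style; the step construction is still elided as in the original:

```java
package com.example; // outside the package scanned by @SpringBootApplication

@Configuration
public class Dummy2 {

    @Autowired
    JobBuilderFactory jobBuilderFactory;

    @Autowired
    StepBuilderFactory stepBuilderFactory;

    @Bean
    public Step step() {
        // < build step. Omitted for code clarity >
    }

    @Bean
    public Job getJob() {
        // Each child context builds its own job, so the names must differ:
        // dummy-job-1 in Dummy1, dummy-job-2 here.
        return jobBuilderFactory.get("dummy-job-2")
                .start(step())
                .build();
    }
}
```

Because each configuration now lives in its own child context, the two step beans no longer collide even though both are named "step".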
There are many other ways to move the classes around to prevent Spring Boot's classpath scanning from picking them up, and as long as Spring Boot doesn't pick up Dummy1 and Dummy2, the app should work as you expect. Good luck!
Upvotes: 1