Reputation: 57
I am studying Spring Batch with Spring Boot following this tutorial: https://www.petrikainulainen.net/programming/spring-framework/spring-batch-tutorial-reading-information-from-a-rest-api/
Here is the related GitHub project: https://github.com/pkainulainen/spring-batch-examples/tree/master/reading-data/rest-api
I am having some difficulty understanding the exact execution flow and how I can modify it to implement my custom behavior. I will try to explain my doubts here.
This project contains the SpringBatchExampleJobConfig configuration class, where the beans used by the Spring bean factory are defined. These beans should then be injected into the other classes (is that correct?):
@Configuration
public class SpringBatchExampleJobConfig {

    private static final String PROPERTY_REST_API_URL = "rest.api.url";

    @Bean
    public ItemReader<StudentDTO> itemReader(Environment environment, RestTemplate restTemplate) {
        return new RESTStudentReader(environment.getRequiredProperty(PROPERTY_REST_API_URL), restTemplate);
    }

    @Bean
    public ItemWriter<StudentDTO> itemWriter() {
        return new LoggingItemWriter();
    }

    /**
     * Creates a bean that represents the only step of our batch job.
     * @param reader
     * @param writer
     * @param stepBuilderFactory
     * @return
     */
    @Bean
    public Step exampleJobStep(ItemReader<StudentDTO> reader,
                               ItemWriter<StudentDTO> writer,
                               StepBuilderFactory stepBuilderFactory) {
        return stepBuilderFactory.get("exampleJobStep")
                .<StudentDTO, StudentDTO>chunk(1)
                .reader(reader)
                .writer(writer)
                .build();
    }

    /**
     * Creates a bean that represents our example batch job.
     * @param exampleJobStep
     * @param jobBuilderFactory
     * @return
     */
    @Bean
    public Job exampleJob(Step exampleJobStep,
                          JobBuilderFactory jobBuilderFactory) {
        return jobBuilderFactory.get("exampleJob")
                .incrementer(new RunIdIncrementer())
                .flow(exampleJobStep)
                .end()
                .build();
    }
}
Basically it is defining the itemReader bean (which reads data from a source) and the itemWriter bean (which writes data to a destination).
Then it is creating the exampleJobStep bean that defines the single step of this job: basically the step reads from the source and passes the data to the writer, which writes it to the destination. Correct?
Finally it is creating the exampleJob bean, this one:
@Bean
public Job exampleJob(Step exampleJobStep,
                      JobBuilderFactory jobBuilderFactory) {
    return jobBuilderFactory.get("exampleJob")
            .incrementer(new RunIdIncrementer())
            .flow(exampleJobStep)
            .end()
            .build();
}
It should simply execute the step defined by the exampleJobStep bean.
Is my understanding correct so far?
My doubt is: where and how is the job start defined (I mean the job represented by the previous exampleJob bean)?
In the GitHub code you can find the SpringBatchExampleJobLauncher class, which I think is where the job is executed. Basically it is autowiring the Job object into the constructor:
@Autowired
public SpringBatchExampleJobLauncher(Job job, JobLauncher jobLauncher) {
    this.job = job;
    this.jobLauncher = jobLauncher;
}
My doubt is: is it autowiring by type? Does that mean that, when I define a bean like this in the bean configuration class:
@Bean
public Job exampleJob(Step exampleJobStep,
                      JobBuilderFactory jobBuilderFactory) {
    return jobBuilderFactory.get("exampleJob")
            .incrementer(new RunIdIncrementer())
            .flow(exampleJobStep)
            .end()
            .build();
}
this is the bean that will be autowired into the constructor of the previous SpringBatchExampleJobLauncher class, where my job is started?
Assuming I have understood correctly how it works: suppose now that I need to implement two different jobs and I want to start both of them. What can I do to implement this behavior?
At the moment my SpringBatchExampleJobLauncher class contains:
@Autowired
public SpringBatchExampleJobLauncher(Job job, JobLauncher jobLauncher) {
    this.job = job;
    this.jobLauncher = jobLauncher;
}

@Scheduled(cron = "0/10 * * * * *")
public void runSpringBatchExampleJob() throws JobParametersInvalidException, JobExecutionAlreadyRunningException, JobRestartException, JobInstanceAlreadyCompleteException {
    LOGGER.info("Spring Batch example job was started");
    jobLauncher.run(job, newExecution());
    LOGGER.info("Spring Batch example job was stopped");
}
It autowires (I suspect by type) my one and only Job object, and then this job is started by the runSpringBatchExampleJob method.
OK... my idea for having multiple jobs is to define two different beans in my UpdateInfoBatchConfig class. But in that case, how can I define two different beans of type Job? And how do I correctly inject these two different job beans into my SpringBatchExampleJobLauncher class?
Upvotes: 0
Views: 2268
Reputation: 3889
There is more than one way to solve the problem. The most convenient is to use Spring Boot: you can add any number of job beans to the application context and Spring Boot will launch all of them on application start-up, or only a subset if you use the spring.batch.job.names property. Please have a look at the official documentation to see whether this fits your needs: https://docs.spring.io/spring-boot/docs/2.5.3/reference/html/howto.html#howto.batch
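For example (just a sketch: the file location assumes a standard Spring Boot application.properties, and the job names are placeholders for your own job bean names):

# application.properties
spring.batch.job.names=jobOne,jobTwo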
The most straightforward way to go about it is to use bean names as qualifiers. By default, beans carry the name of the method producing them, but you can also set the name explicitly. For example, the following code sketch produces two job beans named jobOne and jobTwo:
@Bean
Job jobOne() {
    return jobBuilderFactory
            ...
            .build();
}

@Bean("jobTwo")
Job aMethodNameWithoutInfluenceOnTheBeanName() {
    return jobBuilderFactory
            ...
            .build();
}
You can then use the bean names to qualify which job bean you want to have auto-wired:
@Autowired
public SpringBatchExampleJobLauncher(
        @Qualifier("jobOne") Job job1,
        @Qualifier("jobTwo") Job job2,
        JobLauncher jobLauncher
) {
    this.job1 = job1;
    this.job2 = job2;
    this.jobLauncher = jobLauncher;
}
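To actually start both jobs, you can then run each of them from the scheduled method. The following is only a sketch, assuming that, like newExecution() in your current launcher, you pass unique JobParameters on every run (the launchTime parameter name is made up for illustration):

@Scheduled(cron = "0/10 * * * * *")
public void runSpringBatchExampleJobs() throws JobExecutionException {
    // JobExecutionException is the common superclass of the checked
    // exceptions thrown by JobLauncher.run()
    jobLauncher.run(job1, new JobParametersBuilder()
            .addLong("launchTime", System.currentTimeMillis())
            .toJobParameters());
    jobLauncher.run(job2, new JobParametersBuilder()
            .addLong("launchTime", System.currentTimeMillis())
            .toJobParameters());
}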
The same trick allows you to select which steps should be auto-wired into which job bean method.
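A minimal sketch of that, assuming a step bean named stepTwo is defined somewhere in your configuration (the name is purely illustrative):

@Bean
Job jobTwo(@Qualifier("stepTwo") Step stepTwo,
           JobBuilderFactory jobBuilderFactory) {
    return jobBuilderFactory.get("jobTwo")
            .incrementer(new RunIdIncrementer())
            .flow(stepTwo)
            .end()
            .build();
}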
Upvotes: 1