ftrujillo

Reputation: 1192

Execute multiple spring batch jobs concurrently with different parameters

I have one Spring Batch job configured, which runs inside a Spring web service. The job has a few steps. I have deployed two instances of this web service in different Tomcats (but both instances use the same MySQL database).

I would like to run the Spring Batch job concurrently in both Tomcats (one execution in each) with different parameters. I am not using partitioning, and the parameters for each job are completely different.

I start the job in one of the Tomcats and everything looks fine. But when I start a second job in the second Tomcat, the job is created but it does not start; it does not even execute the first line of code of the first step.

I am not an expert with Spring Batch, so maybe I am doing something wrong. But if the Spring Batch jobs are running in two separate Tomcat instances, they should run in parallel, no?

This is the job configuration:

    <?xml version="1.0" encoding="UTF-8"?>
    <beans 
        xmlns="http://www.springframework.org/schema/beans" 
        xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance"
        xmlns:batch="http://www.springframework.org/schema/batch"
        xsi:schemaLocation="
            http://www.springframework.org/schema/beans 
            http://www.springframework.org/schema/beans/spring-beans.xsd
            http://www.springframework.org/schema/batch 
            http://www.springframework.org/schema/batch/spring-batch-3.0.xsd">

        <job id="uploadProjectDataJobNormal" xmlns="http://www.springframework.org/schema/batch">
            <step id="setupProject" next="loadReferenceBuilds">
                <tasklet ref="projectSetupTasklet"/>
                <listeners>
                    <listener ref="promotionListener"/>
                    <listener ref="snpAwareStepListener"/>
                    <listener ref="snpAwareItemReadListener"/>
                </listeners>
            </step>

            <step id="loadReferenceBuilds" next="snpToMorph">
                <tasklet>
                    <chunk reader="faiReader" processor="faiProcessor" writer="faiWriter" commit-interval="100"/>
                </tasklet>
                <listeners>
                    <listener ref="promotionListener"/>
                    <listener ref="snpAwareStepListener"/>
                    <listener ref="snpAwareItemReadListener"/>
                </listeners>
            </step>

            <step id="snpToMorph" next="indelToMorph">
                <tasklet>
                    <chunk reader="snpReader" processor="snpProcessor" writer="snpWriter" commit-interval="100"/>
                </tasklet>
                <listeners>
                    <listener ref="promotionListener"/>
                    <listener ref="snpAwareStepListener"/>
                    <listener ref="snpAwareItemReadListener"/>
                </listeners>
            </step>

            <step id="indelToMorph">
                <tasklet>
                    <chunk reader="indelReader" processor="indelProcessor" writer="indelWriter" commit-interval="100"/>
                </tasklet>
                <listeners>
                    <listener ref="promotionListener"/>
                    <listener ref="snpAwareStepListener"/>
                    <listener ref="snpAwareItemReadListener"/>
                </listeners>
            </step>

            <listeners>
                <listener ref="snpAwareBatchJobListener"/>
            </listeners>
        </job>
    </beans>

This is how I start the jobs:

this.jobLauncher.run(this.uploadProjectDataJobNormal, jobParameters);

The job parameters include values that are unique between the two jobs, such as a date and the name of the element I want to upload.
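As an illustration (the parameter keys here are hypothetical, not the actual ones I use), a unique parameter set per launch can be built with `JobParametersBuilder`; identifying parameters that differ between launches make Spring Batch create a separate `JobInstance` for each run:

```java
import java.util.Date;

import org.springframework.batch.core.JobParameters;
import org.springframework.batch.core.JobParametersBuilder;

public class UploadJobStarter {

    /**
     * Builds an identifying parameter set. A distinct element name and
     * date per launch makes Spring Batch treat each run as a new
     * JobInstance, so the two Tomcats never collide on the same instance.
     */
    private JobParameters buildParameters(String elementName) {
        return new JobParametersBuilder()
                .addString("elementName", elementName) // hypothetical key
                .addDate("runDate", new Date())        // unique per launch
                .toJobParameters();
    }
}
```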

The job repository and launcher are configured as follows:

/**
 * Job repository.
 * 
 * @return the job repository.
 * @throws Exception in case the job repository could not be created.
 */
@Bean
public JobRepository jobRepository() throws Exception {
    JobRepositoryFactoryBean jobRepositoryFactory = new JobRepositoryFactoryBean();
    jobRepositoryFactory.setDataSource(this.persistenceConfig.dataSource());
    jobRepositoryFactory.setTransactionManager(this.persistenceConfig.transactionManager());
    jobRepositoryFactory.setIsolationLevelForCreate("ISOLATION_DEFAULT");
    return jobRepositoryFactory.getJobRepository();
}

/**
 * Job launcher.
 * 
 * @return the job launcher.
 * @throws Exception in case the job launcher could not be created.
 */
@Bean
public JobLauncher jobLauncher() throws Exception {
    SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
    jobLauncher.setJobRepository(this.jobRepository());
    jobLauncher.setTaskExecutor(this.taskExecutor());
    return jobLauncher;
}

/**
 * Task executor.
 * 
 * @return the task executor.
 */
@Bean
public TaskExecutor taskExecutor() {
    SimpleAsyncTaskExecutor ex = new SimpleAsyncTaskExecutor();
    ex.setConcurrencyLimit(1);
    return ex;
}

UPDATE: One solution I thought about is creating a second job declaration with another name, like "uploadProjectDataJobNormal2". Would that help?

Upvotes: 2

Views: 5746

Answers (1)

ftrujillo

Reputation: 1192

In the end the solution was simpler than expected: change the concurrency limit to 2 in the job launcher's task executor:

ex.setConcurrencyLimit(2);

My belief was that this limit would not matter if the Spring Batch jobs run in different JVMs, but it does.
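For completeness, here is what the full task-executor bean looks like after the change (assuming the same `SimpleAsyncTaskExecutor` setup as in the question):

```java
import org.springframework.context.annotation.Bean;
import org.springframework.core.task.SimpleAsyncTaskExecutor;
import org.springframework.core.task.TaskExecutor;

@Bean
public TaskExecutor taskExecutor() {
    SimpleAsyncTaskExecutor ex = new SimpleAsyncTaskExecutor();
    // Allow up to two job executions to run at the same time
    // through this launcher, instead of serializing them.
    ex.setConcurrencyLimit(2);
    return ex;
}
```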

Upvotes: 3
