Jimm

Reputation: 8505

Configuring multiple versions of a job in Spring Batch

Spring Batch seems to be lacking metadata for the job definition in the database.

To create a job instance in the database, the only things it considers are the jobName and jobParameters: "JobInstance createJobInstance(String jobName, JobParameters jobParameters);"

But the object model of a Job is rich enough to include steps and listeners. So if I create a new version of an existing job by adding a few additional steps, Spring Batch does not distinguish it from the previous version. Hence, if I ran the previous version today and then run the updated version with the same parameters, Spring Batch does not run the updated version, because it considers the previous run successful. At present, it seems the version number of the job would have to be part of the job name. Is this a correct understanding?
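For illustration, here is a minimal sketch of what I mean (the job name "reportJob", the runDate parameter, and the injected beans are just placeholders):

    import org.springframework.batch.core.Job;
    import org.springframework.batch.core.JobParameters;
    import org.springframework.batch.core.JobParametersBuilder;
    import org.springframework.batch.core.launch.JobLauncher;
    import org.springframework.batch.core.repository.JobInstanceAlreadyCompleteException;

    public class VersionedJobLaunch {

        // "reportJob" stands for any job whose step list changed between releases.
        public void launchUpdatedVersion(JobLauncher jobLauncher, Job reportJob) throws Exception {
            JobParameters params = new JobParametersBuilder()
                    .addString("runDate", "2016-06-01")
                    .toJobParameters();

            try {
                jobLauncher.run(reportJob, params);
            } catch (JobInstanceAlreadyCompleteException e) {
                // Thrown even though the new job definition has extra steps: the job repository
                // identifies the instance only by jobName + identifying parameters.
            }
        }
    }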

Upvotes: 0

Views: 275

Answers (1)

Dean Clark

Reputation: 3868

You are correct that the framework identifies each job instance by a unique combination of job name and (identifying) job parameters.

In general, if a job fails, you should be able to re-run with the same parameters to restart the failed instance. However, you cannot restart a completed instance. From the documentation:

A JobInstance can be restarted multiple times in case of execution failure, and its lifecycle ends with the first successful execution. Trying to execute an existing JobInstance that has already completed successfully will result in an error. An error will also be raised for an attempt to restart a failed JobInstance if the Job is not restartable.

So you're right that the same combination of job name and identifying parameters cannot be run to successful completion more than once. The framework design prevents this, regardless of what business steps the job performs. Again, ignoring what your job actually does, here's how it would work:

1) jobName=myJob, parm1=foo   , parm2=bar -> runs and fails (assume some exception)
2) jobName=myJob, parm1=foo   , parm2=bar -> restarts failed instance and completes
3) jobName=myJob, parm1=foo   , parm2=bar -> fails on startup (as expected)
4) jobName=myJob, parm1=foobar, parm2=bar -> new params, runs and completes
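As a sketch, scenario 4 could be launched like this (the JobLauncher and the "myJob" bean are assumed to be wired up elsewhere):

    import org.springframework.batch.core.Job;
    import org.springframework.batch.core.JobExecution;
    import org.springframework.batch.core.JobParameters;
    import org.springframework.batch.core.JobParametersBuilder;
    import org.springframework.batch.core.launch.JobLauncher;

    public class LaunchExample {

        // Assumes a configured JobLauncher and the "myJob" bean are injected elsewhere.
        public JobExecution launchScenario4(JobLauncher jobLauncher, Job myJob) throws Exception {
            JobParameters params = new JobParametersBuilder()
                    .addString("parm1", "foobar")   // changed value -> new JobInstance
                    .addString("parm2", "bar")
                    .toJobParameters();
            // Would throw JobInstanceAlreadyCompleteException if this exact
            // combination had already completed successfully.
            return jobLauncher.run(myJob, params);
        }
    }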

The "best practices" we use are the following:

  • Each job instance (usually defined by the run date or the filename we are processing) must define a unique set of parameters (otherwise it will fail per the framework design)
  • Jobs that run multiple times a day but just scan a work table or something similar use an incrementer to pass an integer parameter, which we increase by 1 upon each successful completion (see the sketch after this list)
  • Any failed job instances must be either restarted or abandoned before pushing code changes that affect how the job will function
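For the incrementer approach above, here is a rough sketch of wiring a RunIdIncrementer into a job definition (the bean and job names are made up, and the tasklet is a placeholder). Note that the incrementer only takes effect when the job is launched through something that asks for the next parameters, such as JobOperator.startNextInstance, CommandLineJobRunner with -next, or Spring Boot's runner:

    import org.springframework.batch.core.Job;
    import org.springframework.batch.core.Step;
    import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
    import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
    import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
    import org.springframework.batch.core.launch.support.RunIdIncrementer;
    import org.springframework.batch.repeat.RepeatStatus;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    @Configuration
    @EnableBatchProcessing
    public class WorkTableJobConfig {

        @Bean
        public Job scanWorkTableJob(JobBuilderFactory jobs, Step scanStep) {
            return jobs.get("scanWorkTableJob")
                       // adds/increments a "run.id" parameter on each launch,
                       // so every run gets a new JobInstance
                       .incrementer(new RunIdIncrementer())
                       .start(scanStep)
                       .build();
        }

        @Bean
        public Step scanStep(StepBuilderFactory steps) {
            return steps.get("scanStep")
                        .tasklet((contribution, chunkContext) -> {
                            // scan the work table here
                            return RepeatStatus.FINISHED;
                        })
                        .build();
        }
    }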

Upvotes: 1
