BTalker

Reputation: 67

Spring Batch architecture

Hi

I am a novice in the Spring Batch world. Over the last few days I have watched Michael Minella's YouTube videos, read some of the documentation and successfully run some demo projects I found on the internet. I think Spring Batch is a strong candidate for our needs. But here is our story.

I work for a company that developed its own scheduling and batch framework, more than a decade ago, for its business department. The framework is capable of running DB stored procedures, DB functions and dynamic SQL. Needless to say, it is very challenging to maintain, since too many people with varying development skills did the coding and they no longer work here. Our framework can run jobs and steps sequentially as well as asynchronously (like Spring Batch). We also have a job repository where we store whole job definitions (users create new jobs via a GUI) and job instances with their context (if the server goes down, the job resumes once the server is back up). My questions are the following:

  1. Can we create new Spring Batch jobs dynamically (either via XML or code) and store them in the JobRepository DB via the standard SB interfaces?

  2. Today, during certain time periods, we have up to a hundred job executions running simultaneously, all reusing a connection pool to the DB. Older Spring Batch reference documentation states that JobFactory will create a fresh ApplicationContext for each job execution. How can we keep reusing connection pools if this is the case in Spring Batch?

  3. I know there is support for continuing failed steps, but what if the server/app goes down? Will I be able to restart my app and retrieve the job instance with its context from the JobRepository in order to continue from the failed step?

  4. Can a "step1.1" in "job1" be dependent on "step 2.1" from "job2" finishing within last hour? In such scenarios I may be using a step listener on "step1.1" to accomplish this?


Kind regards

Toto

Upvotes: 4

Views: 850

Answers (2)

Michael Minella

Reputation: 21493

You have a lot of material here to cover, so let me respond one point at a time:

Can we create new Spring Batch jobs dynamically (either via XML or code) and store them in the JobRepository DB via the standard SB interfaces?

Can you generate a job definition dynamically? Yes. We do it in Spring XD for the job orchestration piece (the composed job DSL is used to generate an XML file, for example).

Does Spring Batch provide facilities to do this? No. You'd have to code it yourself.

Also note that you'd have to store the definition in your own table (the schema defined by Spring Batch doesn't have a table for this).
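For illustration only, here is a rough sketch of what "coding it yourself" could look like: loading a definition from a table you own (the JOB_DEFINITION table, its SQL_TEXT column and the buildFromDefinition method are hypothetical, not part of Spring Batch) and turning it into a Job at runtime with the standard builders:

    import org.springframework.batch.core.Job;
    import org.springframework.batch.core.Step;
    import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
    import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
    import org.springframework.batch.repeat.RepeatStatus;
    import org.springframework.jdbc.core.JdbcTemplate;
    import org.springframework.stereotype.Component;

    @Component
    public class DynamicJobFactory {

        private final JobBuilderFactory jobs;
        private final StepBuilderFactory steps;
        private final JdbcTemplate jdbcTemplate;

        public DynamicJobFactory(JobBuilderFactory jobs, StepBuilderFactory steps,
                                 JdbcTemplate jdbcTemplate) {
            this.jobs = jobs;
            this.steps = steps;
            this.jdbcTemplate = jdbcTemplate;
        }

        // JOB_DEFINITION is a table you would own and maintain yourself;
        // the Spring Batch schema has no table for job definitions.
        public Job buildFromDefinition(String jobName) {
            String sql = jdbcTemplate.queryForObject(
                    "SELECT SQL_TEXT FROM JOB_DEFINITION WHERE JOB_NAME = ?",
                    String.class, jobName);

            Step step = steps.get(jobName + ".step1")
                    .tasklet((contribution, chunkContext) -> {
                        jdbcTemplate.execute(sql);   // run the stored SQL / proc call
                        return RepeatStatus.FINISHED;
                    })
                    .build();

            return jobs.get(jobName).start(step).build();
        }
    }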

Today, during certain time periods, we have up to a hundred job executions running simultaneously, all reusing a connection pool to the DB. Older Spring Batch reference documentation states that JobFactory will create a fresh ApplicationContext for each job execution. How can we keep reusing connection pools if this is the case in Spring Batch?

You can use parent/child context configurations to reuse beans including a DataSource. Define the DataSource in the parent and then the jobs that depend on it in child contexts.
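As a minimal sketch of that wiring (the DataSourceConfig and Job1Config classes are hypothetical, and HikariCP is used only as an example pool):

    import javax.sql.DataSource;
    import com.zaxxer.hikari.HikariDataSource;
    import org.springframework.context.annotation.AnnotationConfigApplicationContext;
    import org.springframework.context.annotation.Bean;
    import org.springframework.context.annotation.Configuration;

    public class ParentChildContextSketch {

        // Parent context: shared infrastructure such as the pooled DataSource.
        @Configuration
        static class DataSourceConfig {
            @Bean
            public DataSource dataSource() {
                HikariDataSource ds = new HikariDataSource();   // example pool; configure credentials as needed
                ds.setJdbcUrl("jdbc:h2:mem:batch");             // placeholder URL
                return ds;
            }
        }

        // Child context: one per job; its steps/readers/writers would be declared here.
        @Configuration
        static class Job1Config {
        }

        public static void main(String[] args) {
            AnnotationConfigApplicationContext parent =
                    new AnnotationConfigApplicationContext(DataSourceConfig.class);

            // The job's context is a child of the parent, so its beans can inject
            // the parent's DataSource instead of creating a new pool per execution.
            AnnotationConfigApplicationContext jobContext = new AnnotationConfigApplicationContext();
            jobContext.setParent(parent);
            jobContext.register(Job1Config.class);
            jobContext.refresh();
        }
    }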

I know there is support for continuing failed steps, but what if the server/app goes down? Will I be able to restart my app and retrieve the job instance with its context from the JobRepository in order to continue from the failed step?

This is really an orchestration concern. Spring Batch, by design, does not address the orchestration of jobs. This allows you to orchestrate them however you want.

The way I'd recommend handling this is via Spring XD or (depending on your timelines) Spring Cloud Data Flow. These tools provide orchestration capabilities, including the redeployment of a job if it goes down. That being said, they won't restart a job that was running if it fails, because that typically requires some form of human decision based on the use case. However, Spring XD currently has (and Spring Cloud Data Flow will have) the capabilities to implement something like this in a pretty straightforward way.

Can a "step1.1" in "job1" be dependent on "step 2.1" from "job2" finishing within last hour? In such scenarios I may be using a step listener on "step1.1" to accomplish this?

In cases like this, I'd start to question how your job is configured. If it still makes sense, you can use a JobExecutionDecider to decide whether a step should be executed.
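A rough sketch of such a decider, assuming a JobExplorer is injected to query job2's executions (the class name and the custom DEPENDENCY_NOT_MET status are just for illustration):

    import java.util.Date;
    import org.springframework.batch.core.BatchStatus;
    import org.springframework.batch.core.JobExecution;
    import org.springframework.batch.core.JobInstance;
    import org.springframework.batch.core.StepExecution;
    import org.springframework.batch.core.explore.JobExplorer;
    import org.springframework.batch.core.job.flow.FlowExecutionStatus;
    import org.springframework.batch.core.job.flow.JobExecutionDecider;

    // Only lets the next step run if job2's step2.1 completed within the last hour.
    public class RecentDependencyDecider implements JobExecutionDecider {

        private final JobExplorer jobExplorer;

        public RecentDependencyDecider(JobExplorer jobExplorer) {
            this.jobExplorer = jobExplorer;
        }

        @Override
        public FlowExecutionStatus decide(JobExecution jobExecution, StepExecution stepExecution) {
            Date oneHourAgo = new Date(System.currentTimeMillis() - 60 * 60 * 1000);

            // Look at the most recent instances of job2 and their step executions.
            for (JobInstance instance : jobExplorer.getJobInstances("job2", 0, 10)) {
                for (JobExecution execution : jobExplorer.getJobExecutions(instance)) {
                    for (StepExecution step : execution.getStepExecutions()) {
                        if ("step2.1".equals(step.getStepName())
                                && step.getStatus() == BatchStatus.COMPLETED
                                && step.getEndTime() != null
                                && step.getEndTime().after(oneHourAgo)) {
                            return FlowExecutionStatus.COMPLETED;   // dependency satisfied
                        }
                    }
                }
            }
            return new FlowExecutionStatus("DEPENDENCY_NOT_MET");
        }
    }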

All things considered, while you can accomplish most of what you're looking for with Spring Batch, using something like Spring XD or Spring Cloud Data Flow will make your life a lot easier.

Upvotes: 2

Artefacto

Reputation: 97845

Can we create new Spring Batch jobs dynamically (either via XML or code) and store them in the JobRepository DB via the standard SB interfaces?

It is easy to use StepBuilderFactory, FlowBuilder, etc. to programmatically build the Spring Batch artifacts. You'll probably want to back those artifacts with Spring beans (to get nice facilities like the step/job Spring scopes, injection and so on), and for that you can use prototype, step-scoped and job-scoped beans, or even use facilities such as BeanDefinitionBuilder to dynamically create bean definitions.
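For example, a sketch of the BeanDefinitionBuilder approach could look like this (SqlTasklet and its sql property are placeholders, and the late-binding expression assumes the "step" scope is registered, as it is with @EnableBatchProcessing):

    import org.springframework.batch.core.StepContribution;
    import org.springframework.batch.core.scope.context.ChunkContext;
    import org.springframework.batch.core.step.tasklet.Tasklet;
    import org.springframework.batch.repeat.RepeatStatus;
    import org.springframework.beans.factory.support.BeanDefinitionBuilder;
    import org.springframework.beans.factory.support.BeanDefinitionRegistry;

    public class DynamicBeanRegistration {

        // Placeholder tasklet; the "sql" property exists only for this illustration.
        public static class SqlTasklet implements Tasklet {
            private String sql;
            public void setSql(String sql) { this.sql = sql; }
            @Override
            public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) {
                // execute this.sql against your DataSource here
                return RepeatStatus.FINISHED;
            }
        }

        // Registers a step-scoped tasklet definition at runtime; the #{...} expression
        // is resolved late, per step execution, by the step scope.
        public static void registerTasklet(BeanDefinitionRegistry registry) {
            BeanDefinitionBuilder builder = BeanDefinitionBuilder
                    .genericBeanDefinition(SqlTasklet.class)
                    .setScope("step")
                    .addPropertyValue("sql", "#{jobParameters['sql']}");
            registry.registerBeanDefinition("dynamicTasklet", builder.getBeanDefinition());
        }
    }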

Older Spring Batch reference documentation states that JobFactory will create a fresh ApplicationContext for each job execution. How can we keep reusing connection pools if this is the case in Spring Batch?

The GenericApplicationContextFactory creates a child application context. You can have the "global" beans in the parent application context.
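A minimal sketch of wiring that up by hand (the method and its parameter names are just for illustration; normally the factory picks up the parent automatically when it is declared as a bean in that context):

    import org.springframework.batch.core.configuration.support.GenericApplicationContextFactory;
    import org.springframework.context.ApplicationContext;
    import org.springframework.context.ConfigurableApplicationContext;

    public class ChildContextFactorySketch {

        // parent: the context holding the "global" beans (pooled DataSource, JobRepository, ...)
        // jobConfigClass: a @Configuration class (or XML resource) describing one job
        public static ConfigurableApplicationContext createJobContext(
                ApplicationContext parent, Class<?> jobConfigClass) {

            GenericApplicationContextFactory factory =
                    new GenericApplicationContextFactory(jobConfigClass);

            // The factory is ApplicationContextAware; the context handed to it becomes
            // the parent of the context it creates, so the job reuses the parent's beans.
            factory.setApplicationContext(parent);

            return factory.createApplicationContext();
        }
    }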

I know there is support for continuing failed steps, but what if the server/app goes down? Will I be able to restart my app and retrieve the job instance with its context from the JobRepository in order to continue from the failed step?

Yes, but not that easily.

Can a "step1.1" in "job1" be dependent on "step 2.1" from "job2" finishing within last hour? In such scenarios I may be using a step listener on "step1.1" to accomplish this?

A JobExecutionDecider will likely be the best option there.

Upvotes: 1
