Reputation: 31
Our application is built using Spring Boot + Gradle. We have a new requirement to implement a Spring Batch job that will be triggered by Autosys (a job scheduling tool) and has 3 steps performing 3 different operations (file preparation, FTP, and audit). I have attached my @EnableBatchProcessing configuration file for reference. Here are my questions related to Spring Cloud Data Flow.
import org.springframework.batch.core.ExitStatus;
import org.springframework.batch.core.Job;
import org.springframework.batch.core.Step;
import org.springframework.batch.core.configuration.annotation.EnableBatchProcessing;
import org.springframework.batch.core.configuration.annotation.JobBuilderFactory;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.core.launch.JobLauncher;
import org.springframework.batch.core.launch.support.SimpleJobLauncher;
import org.springframework.batch.core.repository.JobRepository;
import org.springframework.batch.core.repository.support.MapJobRepositoryFactoryBean;
import org.springframework.batch.support.transaction.ResourcelessTransactionManager;
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.transaction.PlatformTransactionManager;

@Configuration
@EnableBatchProcessing
public class TaskletsConfig {

    @Autowired
    private JobBuilderFactory jobs;

    @Autowired
    private StepBuilderFactory steps;

    // Step 1: flat file preparation
    @Bean
    protected Step flatFilePreparation() {
        return steps
                .get("flatFilePreparation")
                .tasklet(new FlatFilePreparation())
                .build();
    }

    // Step 2: FTP transfer of the prepared file
    @Bean
    protected Step ftpFile() {
        return steps
                .get("ftpFile")
                .tasklet(new FtpFile())
                .build();
    }

    // Step 3: audit operations
    @Bean
    protected Step auditFilePreparation() {
        return steps
                .get("auditFilePreparation")
                .tasklet(new AuditFilePreparation())
                .build();
    }

    // Routed to when any of the steps above fails
    @Bean
    protected Step errorStep() {
        return steps
                .get("errorStep")
                .tasklet(new ErrorStep())
                .build();
    }

    @Bean
    public Job job() {
        return jobs.get("psbijob")
                .start(flatFilePreparation()).on(ExitStatus.FAILED.getExitCode()).to(errorStep())
                .from(flatFilePreparation()).on("*").to(ftpFile()).on(ExitStatus.FAILED.getExitCode()).to(errorStep())
                .from(ftpFile()).on("*").to(auditFilePreparation())
                .end()
                .build();
    }

    @Bean
    public JobRepository jobRepository() throws Exception {
        MapJobRepositoryFactoryBean factory = new MapJobRepositoryFactoryBean();
        factory.setTransactionManager(transactionManager());
        return (JobRepository) factory.getObject();
    }

    @Bean
    public PlatformTransactionManager transactionManager() {
        return new ResourcelessTransactionManager();
    }

    @Bean
    public JobLauncher jobLauncher() throws Exception {
        SimpleJobLauncher jobLauncher = new SimpleJobLauncher();
        jobLauncher.setJobRepository(jobRepository());
        return jobLauncher;
    }
}
Upvotes: 1
Views: 2069
Reputation: 5651
I will attempt to unpack the questions.
Can a Spring Cloud Data Flow project be built using Gradle?
All that Spring Cloud Data Flow (SCDF) requires to create and launch your batch job is a Spring Boot app (an uber-jar). It doesn't matter whether you build the application with Maven, Gradle, or other custom means. Docker is another popular choice, and it is the only choice for Kubernetes. In PCF, we recommend the use of Maven artifacts, which you can, of course, produce with Gradle, too.
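As a minimal sketch, the app that SCDF registers and launches could look like the following, assuming the Spring Cloud Task starter is on the classpath (class and package names here are illustrative, not taken from your project):

package com.example.batch;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.task.configuration.EnableTask;

// Built as an uber-jar, this is all SCDF needs in order to launch the
// batch job as a task; the batch configuration is picked up by component scan.
@EnableTask
@SpringBootApplication
public class BatchTaskApplication {

    public static void main(String[] args) {
        SpringApplication.run(BatchTaskApplication.class, args);
    }
}

With the Spring Boot Gradle plugin, the bootJar task produces that uber-jar, which you then publish and register with SCDF.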
We are looking for Spring Batch Admin UI kind of features or operations in Spring Cloud Data Flow. May I know how I can configure jobs in Spring Cloud Data Flow? Any sample project?
I'd suggest that you explore the Dashboard section in the reference guide. We also have an end-to-end Task and Batch Developer Guide on the SCDF microsite. Anything and everything you did in Spring Batch Admin is entirely covered in SCDF, and a lot more features sit on top of it, too. Please take a moment to review what is included.
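For instance, once your uber-jar is published, registering and launching the batch job from the SCDF shell looks roughly like this (the app name, task name, and Maven coordinates below are placeholders for your own):

dataflow:>app register --name psbi-batch --type task --uri maven://com.example:psbi-batch:1.0.0
dataflow:>task create psbi-task --definition "psbi-batch"
dataflow:>task launch psbi-task

The same register, create, and launch operations are also available from the Dashboard, and the resulting task and job executions are visible there as well.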
Do we need to add any services/plugins, or must any space be allocated in the PCF infrastructure, to support Spring Cloud Data Flow?
No special requirement. SCDF is just a Boot app, too. You can push it manually to your Org/Space, or use the SCDF Tile for PCF (a fully managed marketplace service) to provision SCDF and the related components (Security / Upgrades / DB / Broker) automatically.
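If you do push it manually, it is an ordinary cf push of the server jar followed by binding the backing services; the app, jar, and service names below are placeholders, and the Tile automates the equivalent setup for you:

cf push dataflow-server -p spring-cloud-dataflow-server.jar --no-start
cf bind-service dataflow-server my-relational-db
cf start dataflow-server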
Is the Spring Cloud Data Flow project production ready? Can it be used in production?
The 1.0 GA of SCDF was released back in July 2016, and we are currently at the 2.1 GA version; there have been 50+ production releases since the first GA milestone. The SCDF Tile for PCF has also been in production for over a year now, and in PCF specifically we have several customers using SCDF in production.
Upvotes: 1