Reputation: 187
I have a Spring Batch application that fetches a file from a samba server and generates a new file in a different folder on the same server. However, only the ItemReader is called in the flow. What is the problem? Thanks.
BatchConfiguration:
@Configuration
@EnableBatchProcessing
public class BatchConfiguration extends BaseConfiguration {

    @Bean
    public ValeTrocaItemReader reader() {
        return new ValeTrocaItemReader();
    }

    @Bean
    public ValeTrocaItemProcessor processor() {
        return new ValeTrocaItemProcessor();
    }

    @Bean
    public ValeTrocaItemWriter writer() {
        return new ValeTrocaItemWriter();
    }

    @Bean
    public Job importUserJob(JobCompletionNotificationListener listener) throws Exception {
        return jobBuilderFactory()
                .get("importUserJob")
                .incrementer(new RunIdIncrementer())
                .repository(getJobRepository())
                .listener(listener)
                .start(this.step1())
                .build();
    }

    @Bean
    public Step step1() throws Exception {
        return stepBuilderFactory()
                .get("step1")
                .<ValeTroca, ValeTroca>chunk(10)
                .reader(this.reader())
                .processor(this.processor())
                .writer(this.writer())
                .build();
    }
}
BaseConfiguration:
public class BaseConfiguration implements BatchConfigurer {

    @Bean
    @Override
    public PlatformTransactionManager getTransactionManager() {
        return new ResourcelessTransactionManager();
    }

    @Bean
    @Override
    public SimpleJobLauncher getJobLauncher() throws Exception {
        final SimpleJobLauncher simpleJobLauncher = new SimpleJobLauncher();
        simpleJobLauncher.setJobRepository(this.getJobRepository());
        return simpleJobLauncher;
    }

    @Bean
    @Override
    public JobRepository getJobRepository() throws Exception {
        return new MapJobRepositoryFactoryBean(this.getTransactionManager()).getObject();
    }

    @Bean
    @Override
    public JobExplorer getJobExplorer() {
        MapJobRepositoryFactoryBean repositoryFactory = this.getMapJobRepositoryFactoryBean();
        return new SimpleJobExplorer(repositoryFactory.getJobInstanceDao(), repositoryFactory.getJobExecutionDao(),
                repositoryFactory.getStepExecutionDao(), repositoryFactory.getExecutionContextDao());
    }

    @Bean
    public MapJobRepositoryFactoryBean getMapJobRepositoryFactoryBean() {
        return new MapJobRepositoryFactoryBean(this.getTransactionManager());
    }

    @Bean
    public JobBuilderFactory jobBuilderFactory() throws Exception {
        return new JobBuilderFactory(this.getJobRepository());
    }

    @Bean
    public StepBuilderFactory stepBuilderFactory() throws Exception {
        return new StepBuilderFactory(this.getJobRepository(), this.getTransactionManager());
    }
}
ValeTrocaItemReader:
@Configuration
public class ValeTrocaItemReader implements ItemReader<ValeTroca> {

    @Value(value = "${url}")
    private String url;

    @Value(value = "${user}")
    private String user;

    @Value(value = "${password}")
    private String password;

    @Value(value = "${domain}")
    private String domain;

    @Value(value = "${inputDirectory}")
    private String inputDirectory;

    @Bean
    @Override
    public ValeTroca read() throws MalformedURLException, SmbException, IOException, Exception {
        File tempOutputFile = getInputFile();

        DefaultLineMapper<ValeTroca> lineMapper = new DefaultLineMapper<>();
        lineMapper.setLineTokenizer(new DelimitedLineTokenizer() {
            {
                setDelimiter(";");
                setNames(new String[]{"id_participante", "cpf", "valor"});
            }
        });
        lineMapper.setFieldSetMapper(
                new BeanWrapperFieldSetMapper<ValeTroca>() {
                    {
                        setTargetType(ValeTroca.class);
                    }
                });

        FlatFileItemReader<ValeTroca> itemReader = new FlatFileItemReader<>();
        itemReader.setLinesToSkip(1);
        itemReader.setResource(new FileUrlResource(tempOutputFile.getCanonicalPath()));
        itemReader.setLineMapper(lineMapper);
        itemReader.open(new ExecutionContext());
        tempOutputFile.deleteOnExit();
        return itemReader.read();
    }
}
Sample of ItemProcessor:
public class ValeTrocaItemProcessor implements ItemProcessor<ValeTroca, ValeTroca> {

    @Override
    public ValeTroca process(ValeTroca item) {
        //Do anything
        ValeTroca item2 = item;
        System.out.println(item2.getCpf());
        return item2;
    }
}
EDIT:
- Spring Boot 2.1.2.RELEASE
- Spring Batch 4.1.1.RELEASE
Upvotes: 0
Views: 2912
Reputation: 187
SOLVED: The problem was that even though I'm not using any databases, Spring Batch, although configured to keep the JobRepository in memory, needs a database (usually H2) to save its metadata tables (job instances, executions, etc.).
In this case, the JDBC and H2 dependencies had been disabled in pom.xml. Simply adding them back to the project solved the problem!
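A minimal sketch of the dependencies in question (assuming Maven and Spring Boot's dependency management, so explicit versions are omitted):

```xml
<!-- pom.xml: JDBC support plus an in-memory H2 database
     for the Spring Batch metadata tables -->
<dependency>
    <groupId>org.springframework.boot</groupId>
    <artifactId>spring-boot-starter-jdbc</artifactId>
</dependency>
<dependency>
    <groupId>com.h2database</groupId>
    <artifactId>h2</artifactId>
    <scope>runtime</scope>
</dependency>
```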
Upvotes: 0
Reputation: 31600
Looking at your configuration, here are a couple of notes:
- BatchConfiguration looks good. That's a typical job with a single chunk-oriented step.
- BaseConfiguration is actually the default configuration you get when using @EnableBatchProcessing without providing a datasource, so this class can be removed.
- Adding @Configuration on ValeTrocaItemReader and marking the method read() with @Bean is not correct. This means you are declaring a bean named read of type ValeTroca in your application context. Moreover, your custom reader uses a FlatFileItemReader but has no added value compared to a FlatFileItemReader. You can declare your reader as a FlatFileItemReader and configure it as needed (resource, line mapper, etc.). This will also avoid the mistake of opening the execution context in the read method, which should be done when initializing the reader, or in the ItemStream#open method if the reader implements ItemStream.
Other than that, I don't see from what you shared why the processor and writer are not called.
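For illustration, a sketch of such a reader declaration using FlatFileItemReaderBuilder (available in Spring Batch 4.x); getInputFile() here stands in for the samba download helper from the question and is an assumption:

```java
// Sketch: declare the reader as a plain FlatFileItemReader bean instead of a custom
// ItemReader. The step opens and closes it automatically, since it implements ItemStream.
@Bean
public FlatFileItemReader<ValeTroca> reader() throws IOException {
    File tempOutputFile = getInputFile(); // assumption: the question's samba download helper

    return new FlatFileItemReaderBuilder<ValeTroca>()
            .name("valeTrocaItemReader")
            .resource(new FileSystemResource(tempOutputFile))
            .linesToSkip(1)
            .delimited()
            .delimiter(";")
            .names(new String[]{"id_participante", "cpf", "valor"})
            .targetType(ValeTroca.class) // configures a BeanWrapperFieldSetMapper internally
            .build();
}
```

With this, the step's chunk loop drives read() item by item until the reader returns null, so the processor and writer get called for each chunk.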
Upvotes: 1