Reputation: 43
I am new to Spring. I have a job that reads a file and writes the records to the database. If the number of records in the file is more than 8000, I should not process the file and should stop the job execution. Please suggest the best way to do this.
Upvotes: 4
Views: 11673
Reputation: 277
You can implement StepExecutionListener on your ItemReader. Then you can get the read count, which corresponds to your line number.
import org.springframework.batch.core.ExitStatus;
import org.springframework.batch.core.StepExecution;
import org.springframework.batch.core.StepExecutionListener;
import org.springframework.batch.item.ItemReader;

public class ExampleItemReader implements ItemReader<String>, StepExecutionListener {

    @Override
    public synchronized String read() throws Exception {
        // Actual reading logic goes here; return null once the input is exhausted.
        return null;
    }

    @Override
    public void beforeStep(StepExecution stepExecution) {
    }

    @Override
    public ExitStatus afterStep(StepExecution stepExecution) {
        // getReadCount() is the number of items read during this step.
        if (stepExecution.getReadCount() > 8000) {
            // A FAILED exit status stops the job after this step.
            return ExitStatus.FAILED;
        }
        return ExitStatus.COMPLETED;
    }
}
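For completeness, here is a rough sketch of how such a reader could be registered as a step listener in Java config; the StepBuilderFactory style, bean names, and chunk size are my own assumptions, not part of the question:

import org.springframework.batch.core.Step;
import org.springframework.batch.core.StepExecutionListener;
import org.springframework.batch.core.configuration.annotation.StepBuilderFactory;
import org.springframework.batch.item.ItemWriter;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

@Configuration
public class FileToDbStepConfig {

    @Bean
    public Step fileToDbStep(StepBuilderFactory steps,
                             ExampleItemReader reader,
                             ItemWriter<String> writer) {
        // The reader also implements StepExecutionListener, so registering it as a
        // listener makes afterStep() run with this step's read count.
        return steps.get("fileToDbStep")
                .<String, String>chunk(100)
                .reader(reader)
                .writer(writer)
                .listener((StepExecutionListener) reader)
                .build();
    }
}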
I'd also advise reading up on Spring Batch patterns.
Upvotes: 2
Reputation: 5837
Not something specific to Spring, but there is a class LineNumberReader in java.io. You can make use of it and its skip method to skip a large number of chars.
Example:
public int getNoOfLines(String fileName) throws IOException {
    // try-with-resources makes sure the reader is closed
    try (LineNumberReader reader = new LineNumberReader(new FileReader(fileName))) {
        // Skips that many chars; if your file may be larger, use Long.MAX_VALUE instead
        reader.skip(Integer.MAX_VALUE);
        return reader.getLineNumber();
    }
}
This is more efficient than reading the file line by line and counting.
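If you want to wire this check into the batch job itself, one option (just a sketch; the class name and the limit parameter are mine, not from the question) is to run the count in a small Tasklet as a validation step placed before the processing step, and fail it when the limit is exceeded:

import java.io.FileReader;
import java.io.LineNumberReader;

import org.springframework.batch.core.StepContribution;
import org.springframework.batch.core.scope.context.ChunkContext;
import org.springframework.batch.core.step.tasklet.Tasklet;
import org.springframework.batch.repeat.RepeatStatus;

public class LineCountValidationTasklet implements Tasklet {

    private final String fileName;
    private final int maxRecords;

    public LineCountValidationTasklet(String fileName, int maxRecords) {
        this.fileName = fileName;
        this.maxRecords = maxRecords;
    }

    @Override
    public RepeatStatus execute(StepContribution contribution, ChunkContext chunkContext) throws Exception {
        try (LineNumberReader reader = new LineNumberReader(new FileReader(fileName))) {
            reader.skip(Long.MAX_VALUE);
            if (reader.getLineNumber() > maxRecords) {
                // Failing this step keeps the job from reaching the processing step.
                throw new IllegalStateException(
                        "File has " + reader.getLineNumber() + " records, limit is " + maxRecords);
            }
        }
        return RepeatStatus.FINISHED;
    }
}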
Upvotes: 1
Reputation: 15092
I'm assuming that the records in your file are fixed length. If so, File.length() will easily give you the number of records in the file (file size divided by record length).
If not, does it need to be exactly 8000 or around 8000? If it's a rough limit, I would get an average record length for these files and then use File.length() to estimate the number of records.
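For example (the 100-byte average is purely illustrative; measure it from a few representative files):

import java.io.File;

public class RecordCountEstimate {

    public static void main(String[] args) {
        File input = new File("records.txt");  // illustrative path
        long averageRecordLength = 100;         // measured average bytes per record, including the line terminator
        long estimatedRecords = input.length() / averageRecordLength;

        if (estimatedRecords > 8000) {
            System.out.println("Roughly " + estimatedRecords + " records - do not process the file");
        }
    }
}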
Upvotes: 0