Reputation: 3945
I have the following use case: I need to process a large number of files. Each file is processed more or less like this:
1) read file
2) perform operation (a) on the content
3) perform operation (b) on the content
4) perform operation (c) on the content
...
n) delete file
Spring Batch seems like a good solution to this problem, with one exception: I don't want to read all the files in step 1), pass all of them to step 2), and so on, because that would use up a lot of memory.
EDIT: I keep my files in memory (not in the DB). This is why I'd prefer to process the files one by one or in small batches. I mean: run all steps on a single file/batch (the file/batch gets removed in the last step, so the memory is freed), then proceed to the next file/batch, and so on.
Does Spring Batch have a mechanism for executing all the steps repeatedly, once per file/batch? Or should I just run the same job multiple times until I've run out of files?
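To make the desired flow concrete, here's a plain-Java sketch of what one iteration should do (class and operation names are placeholders; the real operations are domain-specific):

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;
import java.util.function.UnaryOperator;

public class PerFilePipeline {
    // Illustrative stand-ins for operations (a), (b), (c).
    static final List<UnaryOperator<String>> OPERATIONS = List.of(
            String::trim,                 // operation (a)
            String::toUpperCase,          // operation (b)
            s -> s.replace("  ", " ")     // operation (c)
    );

    // 1) read the file, 2)..(n-1) apply each operation in order, n) delete the file.
    public static String processOne(Path file) throws IOException {
        String content = Files.readString(file);
        for (UnaryOperator<String> op : OPERATIONS) {
            content = op.apply(content);
        }
        Files.delete(file);   // last step frees the file; its memory is reclaimable
        return content;
    }

    public static void main(String[] args) throws IOException {
        Path file = Files.createTempFile("demo", ".txt");
        Files.writeString(file, "  hello world  ");
        System.out.println(processOne(file));              // HELLO WORLD
        System.out.println("deleted: " + !Files.exists(file));
    }
}
```

The point is that only one file's content is ever held at a time; the question is how to express this loop in Spring Batch terms.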
Thanks and best regards, Peter
Upvotes: 1
Views: 7946
Reputation: 10649
In your simple case, for N files you need to execute N jobs, each one passed the file name as a JobParameter. Your individual processing operations cannot each be expressed as a separate Spring Batch step, but you can use a CompositeItemProcessor to chain your processors within a single step.
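The chaining idea behind CompositeItemProcessor can be sketched without the framework (the interface and names below are illustrative, not Spring's actual API): each delegate processes the output of the previous one.

```java
import java.util.List;

// Minimal, framework-free sketch of CompositeItemProcessor-style chaining.
public class CompositeProcessorSketch {
    @FunctionalInterface
    interface ItemProcessor<I, O> { O process(I item); }

    // Builds one processor that runs each delegate in order,
    // feeding each delegate the previous delegate's output.
    static <T> ItemProcessor<T, T> compose(List<ItemProcessor<T, T>> delegates) {
        return item -> {
            T current = item;
            for (ItemProcessor<T, T> d : delegates) {
                current = d.process(current);
            }
            return current;
        };
    }

    public static void main(String[] args) {
        ItemProcessor<String, String> chain = compose(List.of(
                s -> s + "-a",   // operation (a)
                s -> s + "-b",   // operation (b)
                s -> s + "-c"    // operation (c)
        ));
        System.out.println(chain.process("file1"));  // file1-a-b-c
    }
}
```

In real Spring Batch configuration you would set the same list of delegates on a CompositeItemProcessor bean instead of composing them by hand.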
Upvotes: 1
Reputation: 6630
In the Spring Batch documentation this is handled under multi-file input.
It works with one step, and what this will do is:
Upvotes: 2
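Framework aside, the single-step multi-file idea looks roughly like this plain-Java sketch (in Spring Batch the reader role would be played by a MultiResourceItemReader; everything else here is illustrative): the step pulls one file at a time, processes it, and deletes it before moving on, so only one file's content is in memory at once.

```java
import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.ArrayList;
import java.util.List;

// Framework-free sketch of multi-file input handled by a single step.
public class MultiFileStepSketch {
    public static List<String> runStep(List<Path> files) throws IOException {
        List<String> results = new ArrayList<>();
        for (Path file : files) {                    // reader yields one file at a time
            String content = Files.readString(file);
            results.add(content.toUpperCase());      // placeholder for operations (a)..(c)
            Files.delete(file);                      // final stage: remove the file
        }
        return results;
    }

    public static void main(String[] args) throws IOException {
        Path dir = Files.createTempDirectory("batch");
        List<Path> files = new ArrayList<>();
        for (int i = 0; i < 3; i++) {
            Path f = dir.resolve("f" + i + ".txt");
            Files.writeString(f, "item" + i);
            files.add(f);
        }
        System.out.println(runStep(files));          // [ITEM0, ITEM1, ITEM2]
    }
}
```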