Reputation: 318
I have to crawl across several databases that are configured in a table, where each record specifies a schema to read from. So we have to poll the table and run the job accordingly.
I thought of using Spring Batch (JdbcPagingItemReader) to read data from all of the configured schemas. How can I configure this using Spring Batch?
Should I create multiple jobs with a different reader for each database, or is there a way to pass the DataSource at runtime so Spring Batch can read from it?
How can I manage multiple databases with a single Spring Batch job? If that's not possible, is there any other suggestion for database crawling (or harvesting)?
Upvotes: 1
Views: 2336
Reputation: 1964
There are 2 solutions using Spring Batch:

1. Pass the DataSource properties (url, username, password...) at runtime through JobParameters. This means that the reading logic of the first read (the one which tells you what to read) has to be done outside the job.
2. Use a first step which reads the configuration and stores the values in the JobExecutionContext, and a second step which does the actual read using the previously stored values at runtime.

Upvotes: 1
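The first option can be sketched as a step-scoped reader bean that builds its `DataSource` from `JobParameters`. This is a minimal sketch, not a drop-in solution: the `CrawledRow` record, the `source_table` name, and the column names are all hypothetical placeholders.

```java
import javax.sql.DataSource;

import org.springframework.batch.core.configuration.annotation.StepScope;
import org.springframework.batch.item.database.JdbcPagingItemReader;
import org.springframework.batch.item.database.support.SqlPagingQueryProviderFactoryBean;
import org.springframework.beans.factory.annotation.Value;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.jdbc.datasource.DriverManagerDataSource;

@Configuration
public class CrawlJobConfig {

    // Hypothetical row type for the data being crawled.
    public record CrawledRow(long id, String payload) {}

    // @StepScope delays bean creation until the step runs, so the
    // jobParameters passed to JobLauncher.run(...) are available here.
    @Bean
    @StepScope
    public JdbcPagingItemReader<CrawledRow> reader(
            @Value("#{jobParameters['url']}") String url,
            @Value("#{jobParameters['user']}") String user,
            @Value("#{jobParameters['password']}") String password) throws Exception {

        // Build the DataSource from the runtime parameters.
        DataSource dataSource = new DriverManagerDataSource(url, user, password);

        // Paging query; the factory detects the database dialect from the DataSource.
        SqlPagingQueryProviderFactoryBean provider = new SqlPagingQueryProviderFactoryBean();
        provider.setDataSource(dataSource);
        provider.setSelectClause("select id, payload");  // hypothetical columns
        provider.setFromClause("from source_table");     // hypothetical table
        provider.setSortKey("id");

        JdbcPagingItemReader<CrawledRow> reader = new JdbcPagingItemReader<>();
        reader.setDataSource(dataSource);
        reader.setQueryProvider(provider.getObject());
        reader.setPageSize(100);
        reader.setRowMapper((rs, rowNum) ->
                new CrawledRow(rs.getLong("id"), rs.getString("payload")));
        return reader;
    }
}
```

Each row of the configuration table then becomes one job launch, e.g. `jobLauncher.run(crawlJob, new JobParametersBuilder().addString("url", cfg.getUrl()).addString("user", cfg.getUser()).addString("password", cfg.getPassword()).toJobParameters())`.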
Reputation: 1822
If you are just running a query to get some data and then running other queries with it, this is not really something that aligns with what Spring Batch does. That's just a standard JDBC or JPA DAO/Service setup. You can use Quartz or Spring Scheduler to set a CRON value for when you want it to check the table(s).
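The polling side of this suggestion might look like the following sketch using Spring's scheduler. The `crawl_config` table, its columns, and the 15-minute cron are assumptions for illustration; `@EnableScheduling` must be present on a configuration class for `@Scheduled` to fire.

```java
import java.util.List;
import java.util.Map;

import org.springframework.jdbc.core.JdbcTemplate;
import org.springframework.scheduling.annotation.Scheduled;
import org.springframework.stereotype.Component;

@Component
public class CrawlConfigPoller {

    private final JdbcTemplate jdbcTemplate;

    public CrawlConfigPoller(JdbcTemplate jdbcTemplate) {
        this.jdbcTemplate = jdbcTemplate;
    }

    // Runs every 15 minutes (Spring's 6-field cron: sec min hour day month weekday).
    @Scheduled(cron = "0 0/15 * * * *")
    public void poll() {
        // Read the configuration table that lists the schemas to crawl.
        List<Map<String, Object>> configs =
                jdbcTemplate.queryForList("select schema_name, jdbc_url from crawl_config");
        for (Map<String, Object> cfg : configs) {
            // Plain JDBC/JPA reads against cfg.get("jdbc_url") go here --
            // no Spring Batch machinery needed for this use case.
        }
    }
}
```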
Upvotes: 1