Reputation: 315
I'm building a process in a database where multiple source tables each write data, in a separate process, into a single target table. The process should be fault tolerant: in case of an error, all the inserted data should be rolled back. In my opinion there are two ways of implementing this kind of logic across database systems.
Are there maybe more / better options for this kind of problem?
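For illustration, a minimal sketch of the all-or-nothing rollback behaviour described above, using Python's `sqlite3` (the `target` table, column names, and the simulated failure are made up for this example):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE target (id INTEGER, payload TEXT)")

try:
    # the connection context manager wraps everything in one transaction:
    # it commits on success and rolls back if an exception escapes
    with conn:
        conn.execute("INSERT INTO target VALUES (1, 'from_table_a')")
        conn.execute("INSERT INTO target VALUES (2, 'from_table_b')")
        raise RuntimeError("simulated failure mid-process")
except RuntimeError:
    pass

# nothing was persisted: both inserts were rolled back
print(conn.execute("SELECT COUNT(*) FROM target").fetchone()[0])  # prints 0
```

The same idea applies in any engine that supports transactions: group the per-table inserts into one transaction and let an error trigger the rollback.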
Cheers
Upvotes: 0
Views: 96
Reputation: 1025
Given the number of rows (1m) and the fact that you'll be running the process every 5 mins, I don't think either of the approaches you mentioned would suit -
What you could try is -
Insert and commit the data into the target table in batches (of say 5K records; you'll need to tune this after testing various batch sizes)
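A rough sketch of this batched-commit pattern, again using Python's `sqlite3` (the `source`/`target` tables, the 12,000 sample rows, and the batch size are placeholders to keep the example small):

```python
import sqlite3

BATCH_SIZE = 5000  # tune after testing various batch sizes

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE source (id INTEGER, payload TEXT)")
conn.execute("CREATE TABLE target (id INTEGER, payload TEXT)")
conn.executemany("INSERT INTO source VALUES (?, ?)",
                 [(i, f"row{i}") for i in range(12_000)])
conn.commit()

rows = conn.execute("SELECT id, payload FROM source").fetchall()
for start in range(0, len(rows), BATCH_SIZE):
    batch = rows[start:start + BATCH_SIZE]
    conn.executemany("INSERT INTO target VALUES (?, ?)", batch)
    conn.commit()  # each batch is committed independently

print(conn.execute("SELECT COUNT(*) FROM target").fetchone()[0])  # prints 12000
```

Committing per batch keeps each transaction small (important at 1M rows every 5 minutes), but it means partially loaded data is visible between commits, which is exactly the challenge addressed next.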
The challenge is that no one should be able to use the committed data until the whole process finishes successfully. To address this,
This would make sure that -
Cons -
Upvotes: 1