Reputation: 6583
I'm curious about techniques used to build a system where ensuring that no data is lost is the utmost priority. As a simple example, what does a financial institution do to make sure that when money is transferred between accounts, once it has been withdrawn from one account it is, without a doubt, deposited in the other? I'm not so much looking for particular techniques like database transactions as for larger, more architectural concepts, such as how the data is preserved if a server goes down, a queue runs out of space, and so on.
If someone could point me to books or articles on the subject, I'd be much obliged.
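For concreteness, this is the kind of atomicity I mean, shown at the level I'm trying to look beyond (a minimal sketch using Python's sqlite3; the accounts table and amounts are just placeholders):

    import sqlite3

    conn = sqlite3.connect("bank.db")
    conn.execute("CREATE TABLE IF NOT EXISTS accounts (id TEXT PRIMARY KEY, balance INTEGER)")
    conn.execute("INSERT OR IGNORE INTO accounts VALUES ('alice', 100), ('bob', 0)")
    conn.commit()

    def transfer(conn, src, dst, amount):
        """Withdraw from src and deposit into dst as one atomic unit."""
        with conn:  # sqlite3 commits on success, rolls back on any exception
            conn.execute("UPDATE accounts SET balance = balance - ? WHERE id = ?", (amount, src))
            conn.execute("UPDATE accounts SET balance = balance + ? WHERE id = ?", (amount, dst))
        # If anything failed above, neither update is visible:
        # the money is never left "in flight".

    transfer(conn, "alice", "bob", 25)

What I want to understand is how systems keep that guarantee at a larger scale, across machines and sites.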
Upvotes: 2
Views: 187
Reputation: 10225
As you've alluded to, there are various mechanisms (like transactions) for ensuring that the software-based "handshake" is reliable and completes successfully.
Architecturally, yes: having two copies of everything gives you redundancy, which helps you avoid losing data. Beyond that:
I worked on a solution architecture for an off-the-shelf document management system a while back; no loss of data was the big driver. The system was rolled out nationally, so it was multi-site, with both 'regional' caches for servicing local users and actual 'data centers'. Some points of interest:
I guess none of this is heavily software centered, but I do think that all the good software architecture / design principles "we" use helped guide my thinking.
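To make the "two copies" idea concrete in code, here is a minimal, made-up sketch (not how the actual document system worked): a write is only acknowledged once both the regional copy and the central copy have accepted it, and a partial write is undone rather than left inconsistent.

    class Store:
        """Stand-in for a regional cache or a data-centre store."""
        def __init__(self, name):
            self.name = name
            self.data = {}

        def put(self, key, value):
            self.data[key] = value  # imagine a durable write here

    def redundant_write(stores, key, value):
        """Only report success once every copy has the data."""
        written = []
        for store in stores:
            try:
                store.put(key, value)
                written.append(store)
            except Exception:
                # Undo the copies that did succeed so no site is left
                # holding data the others never saw.
                for done in written:
                    done.data.pop(key, None)
                raise

    regional = Store("regional-cache")
    central = Store("data-centre")
    redundant_write([regional, central], "doc-42", b"contents")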
Upvotes: 0
Reputation: 2161
You might want to read up on XA or X/Open transactions, which can coordinate multiple systems, including databases, queues, and more, into ACID, DB-like transactions.
I've not worked with it, but I've heard it can be expensive in terms of latency and computation. Then again, how much is your data integrity worth?
http://en.wikipedia.org/wiki/X/Open_XA
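The idea underneath XA is two-phase commit: every participant is first asked to prepare (promise it can commit), and only if all of them agree does the coordinator tell them to commit. A rough, hypothetical sketch of that protocol, not a real XA API:

    class Participant:
        """A resource (database, queue, ...) taking part in the transaction."""
        def __init__(self, name):
            self.name = name
            self.pending = None

        def prepare(self, work):
            # Persist enough to guarantee we can commit later, then vote yes.
            self.pending = work
            return True

        def commit(self):
            print(f"{self.name}: committed {self.pending}")

        def rollback(self):
            print(f"{self.name}: rolled back {self.pending}")
            self.pending = None

    def two_phase_commit(participants, work):
        # Phase 1: ask everyone to prepare and collect their votes.
        votes = [p.prepare(work) for p in participants]
        if all(votes):
            # Phase 2: everyone promised, so tell them all to commit.
            for p in participants:
                p.commit()
        else:
            for p in participants:
                p.rollback()

    two_phase_commit([Participant("accounts-db"), Participant("audit-queue")], "transfer #1")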
Upvotes: 0
Reputation: 74655
In the case of the bank example, each bank keeps a record of every transaction, stating how much was transferred, from where, to where, and in what order.
Later, if there is a problem, you compare the two transaction logs; if they don't match, you can identify the missing transactions.
This also covers the problem that one bank can't trust another to keep records on its behalf.
Because the banks cross-check each other, this is almost a distributed transaction protocol.
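A small sketch of that cross-check, assuming each transaction carries an id shared by both banks (the field names and accounts are made up):

    # Each bank's log: {transaction_id: (from_acct, to_acct, amount)}
    bank_a_log = {
        "tx1": ("alice@A", "bob@B", 100),
        "tx2": ("carol@A", "bob@B", 50),
    }
    bank_b_log = {
        "tx1": ("alice@A", "bob@B", 100),
        # tx2 never arrived at bank B
    }

    def reconcile(log_a, log_b):
        """Find transactions one side is missing, or that the two sides disagree on."""
        missing_at_b = [tx for tx in log_a if tx not in log_b]
        missing_at_a = [tx for tx in log_b if tx not in log_a]
        disagreements = [tx for tx in log_a.keys() & log_b.keys() if log_a[tx] != log_b[tx]]
        return missing_at_b, missing_at_a, disagreements

    print(reconcile(bank_a_log, bank_b_log))  # (['tx2'], [], [])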
Upvotes: 0
Reputation: 4082
You should read about Automated Teller Machines, Online Transaction Processing, and other topics covering data encryption; also consider using HTTPS if you are thinking about web sites.
Upvotes: 1
Reputation: 24723
It can all really boil down to having the same data in two places: from code that holds data in a cache prior to committing it, all the way up to server redundancy.
The only way to make sure you don't lose something is to have multiple copies of it.
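One low-level reading of the "cache prior to a commitment" point: append the data to a durable journal and force it to disk before telling the caller it's safe, so a crash can't lose anything that has already been acknowledged. A rough sketch (the file name and record format are made up):

    import json
    import os

    def journal_write(path, record):
        """Append the record and force it to disk before acknowledging."""
        with open(path, "a") as f:
            f.write(json.dumps(record) + "\n")
            f.flush()
            os.fsync(f.fileno())  # don't return until the OS says it's on disk
        return True  # only now is it safe to tell the caller "done"

    journal_write("transfers.journal", {"from": "alice", "to": "bob", "amount": 25})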
Upvotes: 0
Reputation: 639
The basic technique is removing any single point of failure. Anything that can fail in your setup needs to have a backup, or multiple backups: switches, servers, UPSs, hard drives, and so on. Databases are constantly replicated, and data is backed up and stored off-site in case of a fire or other disaster that could compromise the building.
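One simplified way replication is often made to tolerate a failed copy (not tied to any particular database): a write only counts once a majority of replicas have acknowledged it, so losing any single copy doesn't lose the data.

    class Replica:
        def __init__(self, name, healthy=True):
            self.name = name
            self.healthy = healthy
            self.data = {}

        def write(self, key, value):
            if not self.healthy:
                raise IOError(f"{self.name} is down")
            self.data[key] = value
            return True

    def quorum_write(replicas, key, value):
        """Succeed only if a majority of replicas accept the write."""
        acks = 0
        for r in replicas:
            try:
                if r.write(key, value):
                    acks += 1
            except IOError:
                pass  # a single failed replica is tolerated
        if acks <= len(replicas) // 2:
            raise RuntimeError("not enough replicas acknowledged the write")
        return acks

    replicas = [Replica("r1"), Replica("r2", healthy=False), Replica("r3")]
    print(quorum_write(replicas, "balance:alice", 75))  # 2 of 3 acks -> success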
Upvotes: 0