Reputation: 10264
I have several huge (2 GB each) databases in Firebird format. Every few weeks, this data has to be synchronized with an MS SQL Server. Currently this is a long-winded process: the destination tables are truncated, then every row is copied across with Database Workbench (a multi-DB management tool). Is there any clever way to do a partial or differential update that would speed up the process, or at least involve less pumping of data between servers?
Upvotes: 2
Views: 1141
Reputation: 1144
Why don't you try making a hash (MD5 or similar) of each record? If you also store (or calculate on the fly) an MD5 hash of your loaded data, then using a table's primary keys together with the hashes you can quickly determine whether a record has changed, or indeed whether it exists at all.
This would, of course, need to be done on a table-by-table basis.
We have used a similar method when loading SQL OLTP data into a DWH, and it works very nicely!
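To illustrate the idea, here is a minimal sketch in Python (the function and variable names are mine, not from any library): each source row is hashed over its column values, and comparing those hashes against a stored hash-per-primary-key for the destination splits the work into inserts, updates, deletes, and rows that can be skipped entirely.

```python
import hashlib


def row_hash(row):
    """MD5 over a stable, delimiter-joined string form of the row's columns.

    NULLs are mapped to the empty string; any stable serialization works
    as long as both sides compute it identically.
    """
    joined = "|".join("" if v is None else str(v) for v in row)
    return hashlib.md5(joined.encode("utf-8")).hexdigest()


def diff_tables(source_rows, dest_hashes):
    """Classify rows by comparing fresh source hashes to stored dest hashes.

    source_rows: dict mapping primary key -> tuple of column values
    dest_hashes: dict mapping primary key -> previously stored MD5 hash
    Returns (keys to insert, keys to update, keys to delete); unchanged
    rows appear in neither list and need no data transfer at all.
    """
    to_insert, to_update = [], []
    for pk, row in source_rows.items():
        if pk not in dest_hashes:
            to_insert.append(pk)          # new on the source side
        elif dest_hashes[pk] != row_hash(row):
            to_update.append(pk)          # same key, changed content
    to_delete = [pk for pk in dest_hashes if pk not in source_rows]
    return to_insert, to_update, to_delete
```

In practice you would store the hash as an extra column (or side table) on the destination, so each sync only ships the rows in the insert/update lists instead of re-copying 2 GB. MS SQL Server can also compute hashes server-side with `HASHBYTES`, which avoids pulling the destination data back out just to compare.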
Upvotes: 1