Reputation: 6616
I'm looking for a native library, or perhaps a relatively simple algorithm, to create and apply binary patches for files up to 1 GB in size. These are binary database exports, and unfortunately there's no other way to get only the changes. The patches don't need to be extremely small; speed and space efficiency matter more (which rules out bsdiff). The files often have less than 1% change, and I would already be satisfied with a patch ten times the minimal size.
Of course there are already quite a few questions about this, but my biggest constraint at the moment is that it must be open source with a permissive license (nothing GPL-ish). For that reason, even though I'm very happy with its characteristics, xdelta3 isn't an option, and neither is rdiff.
One approach I found works reasonably well is to skip the export, take the database file itself, and split it into fixed-size chunks; fewer than 20% of the chunks change between two versions. Unfortunately, the backups have to be taken with the database online, which ties me to exporting, and then this approach doesn't work that well anymore.
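To make the chunk-splitting idea concrete, here is a minimal sketch (function names and the chunk size are my own, not from any library) that compares chunks positionally by hash, stores only the changed chunks as a "patch", and reapplies them. It also illustrates why this breaks down for exports: one insertion early in the file shifts every later chunk, so nearly all of them register as changed.

```python
import hashlib

CHUNK_SIZE = 64 * 1024  # 64 KiB; in practice, tune to the database's page size

def chunk_hashes(data: bytes) -> list:
    """Hash each fixed-size chunk of a file's contents."""
    return [hashlib.sha256(data[i:i + CHUNK_SIZE]).digest()
            for i in range(0, len(data), CHUNK_SIZE)]

def make_patch(old: bytes, new: bytes) -> list:
    """Return (chunk_index, chunk_bytes) pairs for chunks that differ.

    Purely positional comparison: an insertion near the start of the
    file shifts every subsequent chunk, making almost all of them
    mismatch -- the failure mode described above for exports.
    """
    old_hashes = chunk_hashes(old)
    patch = []
    for i in range(0, len(new), CHUNK_SIZE):
        idx = i // CHUNK_SIZE
        chunk = new[i:i + CHUNK_SIZE]
        if (idx >= len(old_hashes)
                or hashlib.sha256(chunk).digest() != old_hashes[idx]):
            patch.append((idx, chunk))
    return patch

def apply_patch(old: bytes, patch: list, new_len: int) -> bytes:
    """Rebuild the new file from the old one plus the changed chunks."""
    out = bytearray(old[:new_len].ljust(new_len, b"\0"))
    for idx, chunk in patch:
        out[idx * CHUNK_SIZE: idx * CHUNK_SIZE + len(chunk)] = chunk
    return bytes(out)
```

With in-place page updates (as in a raw database file), only the touched chunks end up in the patch; a real implementation would also store the chunk hashes instead of the old file, and compress the changed chunks.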
Upvotes: 0
Views: 806
Reputation: 91
HDiffPatch: https://github.com/sisong/HDiffPatch
MIT license; runs on Windows, macOS, Linux, Android, and more.
It supports diffing between large binary files or directories, and both diff and patch can run with limited memory.
Creating a patch: hdiffz -s-1k -c-zlib old_path new_path out_delta_file
(-s optimizes for speed, -m for delta size)
Applying a patch: hpatchz old_path delta_file out_new_path
Upvotes: 1