Jason Kealey

Reputation: 7986

Ways to synch many (small) files over high-latency network connection

We typically deploy our software applications to our clients using Subversion (svn update on the clients; unidirectional). One of our clients is in China while our server is in Canada, and the connection's high latency (raw download speed for large files is fine) is causing problems: Subversion simply times out with an error after a very long wait.

Our application has lots of small files (.aspx, .config, etc.) and a few larger files (.dll, .jpg), for a total of about 100 MB-200 MB.

I am currently considering doing the following:

  1. Do a local svn checkout on the server
  2. Zip the result
  3. FTP or rsync the large zip file to the foreign machine
  4. Unzip the file in a temporary folder
  5. Do a local rsync from that temp folder to our usual installation folder

Are there any better solutions?

Upvotes: 2

Views: 2469

Answers (3)

Yann Ramin

Reputation: 33177

You can use git (perhaps with git-svn) to handle the transfer. It's amazingly efficient at moving differences between file versions.

Otherwise, you can use an xdiff binary diff tool.
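For illustration, a minimal local demonstration of the git approach: after an initial clone, a pull moves only the delta, not the whole tree. The git-svn import is sketched as a comment with a hypothetical URL; the rest runs against throwaway local repositories.

```shell
# One-time import from Subversion would look something like (hypothetical URL):
#   git svn clone https://svn.example.com/app/trunk app-git
set -e
WORK=$(mktemp -d) && cd "$WORK"
git init -q origin-repo && cd origin-repo
git config user.email deploy@example.com && git config user.name deploy
echo 'v1' > default.aspx && git add . && git commit -qm 'v1'
cd "$WORK"
git clone -q origin-repo client-copy       # initial full transfer
cd origin-repo
echo 'v2' > default.aspx && git commit -qam 'v2'
cd "$WORK/client-copy"
git pull -q                                # only the compressed delta moves
cat default.aspx                           # now v2
```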

Upvotes: 0

jpdaigle

Reputation: 1265

Don't knock rsync for the whole tree of small files until you've given it a shot. It doesn't do a round-trip for every single file; it's pipelined, so it should be as fast as anything else on the whole dataset (as fast as TCP can deliver ordered data over your high-latency link).

Check out "How Rsync Works" for an explanation of how it avoids round-trips.

Upvotes: 4

nholling

Reputation: 41

You could create a Unix-style patch of all changes across all files and just transfer that in a zip file.
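A minimal sketch of that approach with diff and patch (the old/new directories stand in for two svn exports of consecutive releases; names are made up):

```shell
set -e
WORK=$(mktemp -d) && cd "$WORK"
mkdir old new
echo 'v1' > old/web.config
echo 'v2' > new/web.config
diff -ruN old new > app.patch || true   # diff exits 1 when files differ; that's expected
cp -r old client                        # the client's current install
( cd client && patch -p1 -s < ../app.patch )   # apply only the changes
cat client/web.config                   # now v2
```

The patch file only contains the changed lines, so it compresses to almost nothing compared to shipping the whole 100 MB-200 MB tree.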

Upvotes: 2
