art.zhitnik

Reputation: 624

"s3cmd get" rewrites local files

I'm trying to download an S3 directory to my local machine using s3cmd, with the command:

s3cmd sync --skip-existing s3://bucket_name/remote_dir ~/local_dir

But if I restart the download after an interruption, s3cmd doesn't skip the local files downloaded earlier; it overwrites them. What is wrong with the command?

Upvotes: 7

Views: 4224

Answers (2)

Alex F

Reputation: 896

I had the same problem and found the solution in comment #38 from William Denniss at http://s3tools.org/s3cmd-sync

If you have:

$ s3cmd sync --verbose s3://mybucket myfolder

Change it to:

$ s3cmd sync --verbose s3://mybucket/ myfolder/   # note the trailing slashes

Then, the MD5 hashes are compared and everything works correctly! --skip-existing works as well.

To recap, neither --skip-existing nor the MD5 checks happen if you use the first command, and both work if you use the second (I made a mistake in my previous post, as I was testing with 2 different directories).
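Applied to the command from the question, the fixed invocation would look like this (bucket and directory names are the asker's; the trailing slashes are the only change):

s3cmd sync --skip-existing s3://bucket_name/remote_dir/ ~/local_dir/

On a re-run after an interruption, files whose MD5 checksums already match should then be skipped rather than downloaded again.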

Upvotes: 16

Joe Van Dyk

Reputation: 6940

Use boto-rsync instead. https://github.com/seedifferently/boto_rsync

It correctly syncs only new/changed files from S3 to the local directory.
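A minimal invocation, assuming the same bucket and paths as in the question, might look like this (boto-rsync takes a source and a destination, and reads AWS credentials from boto's usual configuration, e.g. environment variables or ~/.boto):

boto-rsync s3://bucket_name/remote_dir/ ~/local_dir/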

Upvotes: 3
