Pratik Mandrekar

Reputation: 9568

Uploading gzipped content to AWS S3 from the command line.

I used to use s3cmd, but then I had to upgrade it to version 1.5.0-alpha3. Since then I have been unable to do a proper s3cmd sync with the header --add-header="Content-Encoding: gzip"

The command I use is

s3cmd --add-header="Content-Encoding: gzip" --add-header="Cache-Control:public, max-age=86400" --add-header='Content-Type: application/javascript' --recursive put local remote_S3_bucket

The headers are uploaded but the zipped files somehow get unzipped. I have tried with/without the --no-preserve flag and several other permutations of the headers.

I have also tried doing the same with the aws s3 CLI, but there does not seem to be a way to add a "Content-Encoding: gzip" header with that tool.
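Update: newer releases of the aws CLI do appear to expose these headers as options on cp/sync, so this may depend on which version is installed; a sketch, assuming the same placeholder bucket and paths as above:

```shell
# Sync pre-gzipped files, setting the same three headers the s3cmd
# command attempts: --content-encoding, --cache-control and
# --content-type are documented options of aws s3 sync / aws s3 cp.
aws s3 sync ./local s3://remote_S3_bucket \
    --content-encoding gzip \
    --cache-control "public, max-age=86400" \
    --content-type "application/javascript"
```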

What tool (and which version, from which source, i.e. OS package manager or GitHub) should one use as of today to sync gzipped files successfully?

Upvotes: 2

Views: 4324

Answers (2)

koolhead17

Reputation: 1964

Alternatively, you can use the Minio Client, aka mc.

A simple pipeline like the one below should work (the local path is a placeholder):

$ mc cat local/file | gzip - | mc pipe s3/bucket/file.gz

Hope it helps.

Disclaimer: I work for Minio

Upvotes: 1

Michael - sqlbot

Reputation: 179054

They are "somehow" getting automatically unzipped by your user agent when you download them, because that is exactly what is supposed to happen when you store gzipped content with proper headers like those. Check the size of the file in the AWS console, and I expect you will find that what you stored is smaller than what you downloaded.

When you store gzipped content with Content-Encoding: gzip, you also drop the .gz extension, because what ends up downloaded is no longer gzipped by the time it is handed to the end user: the browser, or wget, etc., content-un-encodes it, if you will.

If you just want to put the .gz file out there for download as a .gz file, don't set Content-Encoding.
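The size check suggested above can be reproduced locally without touching S3; a minimal sketch (the file name is made up, and the actual s3cmd upload is left out since it needs a real bucket):

```shell
#!/bin/sh
# Make a compressible sample file standing in for a JS asset.
yes 'var x = 1;' | head -n 1000 > app.js

# This is what ends up stored in the bucket when you upload
# pre-gzipped content with Content-Encoding: gzip.
gzip -9 -c app.js > app.js.gz

# The stored object (compressed) is smaller than what the browser
# hands you after honoring Content-Encoding: gzip (decompressed).
stored=$(wc -c < app.js.gz)
downloaded=$(wc -c < app.js)
echo "stored=$stored bytes, downloaded=$downloaded bytes"

# gunzip restores the original bytes, just as the user agent does.
gunzip -c app.js.gz | cmp -s - app.js && echo "identical"
```

So if the object in the AWS console is smaller than the file on disk after download, nothing was "unzipped" in storage; the decompression happened on the client.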

Upvotes: 3
