Matt Joiner

Reputation: 118500

Compress file on S3

I have a 17.7GB file on S3. It was generated as the output of a Hive query, and it isn't compressed.

I know that by compressing it, it'll be about 2.2GB (gzip). How can I download this file locally as quickly as possible when transfer is the bottleneck (250kB/s)?

I've not found any straightforward way to compress the file on S3, or enable compression on transfer in s3cmd, boto, or related tools.
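For scale, a quick back-of-the-envelope on why compression matters here (sizes and link speed taken from the question above):

```shell
# Transfer time at 250 kB/s, integer arithmetic:
echo "uncompressed: $((17700000000 / 250000 / 3600)) hours"    # 17.7GB -> ~19 hours
echo "gzipped:      $((2200000000 / 250000 / 60)) minutes"     # 2.2GB  -> ~146 minutes
```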

Upvotes: 37

Views: 71032

Answers (3)

CloudArch

Reputation: 341

There are now pre-built apps in Lambda that you could use to compress images and files in S3 buckets. So just create a new Lambda function and select a pre-built app of your choice and complete the configuration.

  1. Create a new Lambda function.
  2. Search for a pre-built app.
  3. Select the app that suits your needs and complete the configuration process by providing the S3 bucket names.

Upvotes: 3

Navaneeth Pk

Reputation: 662

Late answer but I found this working perfectly.

aws s3 sync s3://your-pics .

find . -name "*.jpg" -print -exec gzip {} \;

aws s3 sync . s3://your-pics --content-encoding gzip --dryrun

This downloads all files in the S3 bucket to the machine (or EC2 instance), compresses the image files, and uploads them back to the bucket. Note that gzip replaces each file.jpg with file.jpg.gz, so the re-uploaded keys carry a .gz suffix. Verify the data before removing the --dryrun flag.

Upvotes: 15

Michel Feldheim

Reputation: 18250

S3 does not support stream compression, nor is it possible to compress an uploaded file remotely.

If this is a one-time process, I suggest downloading it to an EC2 machine in the same region, compressing it there, and then uploading it to your destination.

http://docs.aws.amazon.com/AWSEC2/latest/UserGuide/EC2_GetStarted.html
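A minimal sketch of that one-off route, assuming the AWS CLI is configured on the instance. The bucket and file names are placeholders, so the aws lines are shown as comments and only the local compression step actually runs:

```shell
# On an EC2 instance in the same region as the bucket:
#   aws s3 cp s3://your-bucket/hive-output.csv .     # fast same-region download
#   gzip hive-output.csv                             # 17.7GB -> ~2.2GB
#   aws s3 cp hive-output.csv.gz s3://your-bucket/   # then fetch the .gz over the slow link
#
# The compression step itself, demonstrated on a stand-in file:
seq 100000 > hive-output.csv
gzip hive-output.csv          # replaces the file with hive-output.csv.gz
gzip -l hive-output.csv.gz    # shows compressed vs. uncompressed sizes
```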

If you need this more frequently, see:

Serving gzipped CSS and JavaScript from Amazon CloudFront via S3

Upvotes: 31
