deadlock

Reputation: 7310

Need to resize and replace millions of images on Amazon S3

I currently use Django for my backend and iOS for my front end. Users can take pictures from the iOS app and upload them to Amazon S3. The app then provides Django with the URLs to the images so that users can retrieve them later.

It's a very simple setup. However, we recently ran into a problem: the images need to be resized so that downloads are faster for the user. Django is backed by a PostgreSQL database, which stores all the image links.

This creates another problem, since the database already contains URLs pointing to the old images. I need a way to batch-download all the images in a certain bucket, resize them, and re-upload them to S3 so that each new image replaces, and keeps the same name as, the old one.

Is there a pythonic way of doing this?

Upvotes: 2

Views: 957

Answers (1)

woozyking

Reputation: 5220

Let's say we use an image library such as Pillow.

Sample usage:

from PIL import Image

img = Image.open('your_image_filename')
ogn_size = img.size  # gives you a tuple (width, height)

# Based on ogn_size, do proportional resizing, let's say by 50%. Also apply
# the recommended high-quality downsizing filter ANTIALIAS. Note that
# resize() takes the new dimensions as a single tuple.
image_half = img.resize(
    (int(ogn_size[0] * 0.5), int(ogn_size[1] * 0.5)),
    Image.ANTIALIAS
)

# more stuff to do, such as optimization on save
image_half.save("your_image_filename_half", optimize=True, quality=80)

Please read the documentation for more options and APIs you may be able to utilize in your production code base: https://pillow.readthedocs.org/en/latest/
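The batch part of the question (download from S3, resize, re-upload under the same key so the stored URLs keep working) could be sketched roughly as below. This is a minimal sketch, not a production implementation: it assumes boto3, a hypothetical bucket name, and that every object in the bucket is an image. It uses Image.LANCZOS, which is the current Pillow name for the ANTIALIAS filter.

```python
from io import BytesIO

from PIL import Image


def resize_image_bytes(data, ratio=0.5):
    """Resize raw image bytes proportionally and return the new bytes."""
    img = Image.open(BytesIO(data))
    w, h = img.size
    # LANCZOS is the current name of the high-quality ANTIALIAS filter
    resized = img.resize((int(w * ratio), int(h * ratio)), Image.LANCZOS)
    buf = BytesIO()
    resized.save(buf, format=img.format or 'JPEG', optimize=True)
    return buf.getvalue()


def resize_bucket(bucket_name, ratio=0.5):
    """Download every object in the bucket, resize it, and re-upload it
    under the same key, so existing database links stay valid."""
    import boto3  # imported here so the helper above works without AWS deps

    bucket = boto3.resource('s3').Bucket(bucket_name)
    for obj in bucket.objects.all():
        original = obj.get()['Body'].read()
        bucket.put_object(Key=obj.key,
                          Body=resize_image_bytes(original, ratio))
```

You would run this once per bucket, e.g. resize_bucket('your-bucket-name'). For millions of objects you would want to parallelize and checkpoint progress, since re-uploading under the same key destroys the original.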

Upvotes: 1
