Anish

Reputation: 912

Store uploaded image to AWS S3

I have a server application written in Python/Django (a REST API) that accepts a file upload from a client application. I want the uploaded file to be stored in AWS S3, and I want the client to send it as multipart/form-data. How can I achieve this? Any sample application code would help me understand how it should be done. Please assist.

from rest_framework.views import APIView
from rest_framework.parsers import FileUploadParser


class FileUploadView(APIView):
    parser_classes = (FileUploadParser,)

    def put(self, request, filename, format=None):
        file_obj = request.data['file']
        self.handle_uploaded_file(file_obj)
        return self.get_response("", True, "", {})

    def handle_uploaded_file(self, f):
        # write the uploaded file to a local path, chunk by chunk
        with open('<path>', 'wb+') as destination:
            for chunk in f.chunks():
                destination.write(chunk)

Thanks in advance

Upvotes: 3

Views: 2199

Answers (2)

kchan

Reputation: 846

If you want your uploads to go directly to AWS S3, you can use django-storages and set your Django file storage backend to use AWS S3.

This lets your Django project store files on S3 transparently, without you having to manually re-upload each uploaded file to S3 yourself.

Storage Settings

You will need to add at least these configurations to your Django settings:

# default remote file storage
DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'

# aws access keys
AWS_ACCESS_KEY_ID = 'YOUR-ACCESS-KEY'
AWS_SECRET_ACCESS_KEY = 'YOUR-SECRET-ACCESS-KEY'
AWS_BUCKET_NAME = 'your-bucket-name'
AWS_STORAGE_BUCKET_NAME = AWS_BUCKET_NAME
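Beyond the required settings above, the S3 backend in django-storages reads a few optional settings that control how objects are written. The values below are illustrative assumptions, not requirements:

```python
# optional django-storages S3 settings (illustrative values)

# extra headers sent with every stored object
AWS_HEADERS = {'Cache-Control': 'max-age=86400'}

# default ACL applied to uploaded objects
AWS_DEFAULT_ACL = 'private'

# serve plain object URLs instead of signed, expiring query-string URLs
AWS_QUERYSTRING_AUTH = False
```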

Example Code to Store Upload to Remote Storage

This is a modified version of your view, with the handle_uploaded_file method using Django's storage backend to save the uploaded file to the remote destination (via django-storages).

Note: Be sure to define DEFAULT_FILE_STORAGE and the AWS keys in your settings so django-storages can access your bucket.

from django.core.files.storage import default_storage
from django.core.files import File

# file I/O chunk size (128 KiB) to balance memory use and throughput
FILE_IO_CHUNK_SIZE = 128 * 2**10


class FileUploadView(APIView):
    parser_classes = (FileUploadParser,)

    def put(self, request, filename, format=None):
        file_obj = request.data['file']
        self.handle_uploaded_file(file_obj)
        return self.get_response("", True, "", {})

    def handle_uploaded_file(self, f):
        """
        Write the uploaded file to the remote destination using the
        default storage backend.
        """
        # Django's default storage -- with the settings above, this is
        # the django-storages S3 backend
        storage = default_storage

        # the relative path (key) inside your bucket where you want the
        # upload to end up
        fkey = 'sub-path-in-your-bucket-to-store-the-file'

        # MIME type of the upload -- you may want to parse the upload
        # headers to find out the exact content type of the file
        content_type = 'image/jpeg'

        # set the Content-Type header for the object (the `headers`
        # attribute is specific to the S3 storage backend)
        storage.headers.update({"Content-Type": content_type})

        # wrap the uploaded file in a File object and copy it to the
        # remote storage chunk by chunk
        destination = storage.open(fkey, 'w')
        media = File(f)
        for chunk in media.chunks(chunk_size=FILE_IO_CHUNK_SIZE):
            destination.write(chunk)
        destination.close()
        media.close()
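The chunked copy in handle_uploaded_file is just the generic stream-copy pattern. A minimal stand-alone sketch of that pattern, using in-memory streams instead of Django file objects (names here are hypothetical), looks like this:

```python
import io

CHUNK_SIZE = 128 * 2**10  # 128 KiB, same as FILE_IO_CHUNK_SIZE above


def copy_in_chunks(src, dest, chunk_size=CHUNK_SIZE):
    """Copy a readable binary stream to a writable one, chunk by chunk."""
    while True:
        chunk = src.read(chunk_size)
        if not chunk:
            break
        dest.write(chunk)


# simulate an upload with in-memory streams
upload = io.BytesIO(b"x" * 300_000)  # ~293 KiB fake upload
remote = io.BytesIO()
copy_in_chunks(upload, remote)
print(len(remote.getvalue()))  # 300000
```

Copying in fixed-size chunks keeps memory use bounded regardless of the size of the upload, which is why both Django's File.chunks() and the view above work this way.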

See the django-storages documentation for more explanation and examples of how to access the remote storage.

Upvotes: 3

AChampion

Reputation: 30258

Take a look at the boto package, which provides the AWS APIs:

from boto.s3.connection import S3Connection

s3 = S3Connection(access_key, secret_key)
bucket = s3.get_bucket('<bucket>')
mp = bucket.initiate_multipart_upload('<object>')
# part numbers are 1-based
for i in range(1, <parts> + 1):
    io = <receive-image-part>   # e.g. a StringIO holding the part's bytes
    mp.upload_part_from_file(io, part_num=i)
mp.complete_upload()
</mp>
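Note that S3 requires every part of a multipart upload except the last to be at least 5 MB. A hedged sketch of the splitting step (pure Python, independent of boto; the helper name and sizes are illustrative) might look like:

```python
import io

MIN_PART_SIZE = 5 * 1024 * 1024  # S3 minimum for all parts but the last


def split_into_parts(data, part_size=MIN_PART_SIZE):
    """Yield (part_number, file-like) pairs suitable for passing to
    upload_part_from_file; part numbers are 1-based."""
    # max(..., 1) ensures at least one (possibly empty) part is yielded
    for i in range(0, max(len(data), 1), part_size):
        yield (i // part_size + 1, io.BytesIO(data[i:i + part_size]))


# example: a 12 MB payload splits into two 5 MB parts plus a 2 MB tail
payload = b"\0" * (12 * 1024 * 1024)
parts = list(split_into_parts(payload))
print([(n, len(buf.getvalue())) for n, buf in parts])
# [(1, 5242880), (2, 5242880), (3, 2097152)]
```

Each yielded BytesIO could then be passed to upload_part_from_file with its part number, as in the loop above.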

Upvotes: -1
