finngu

Reputation: 527

Django use private S3 storage only in production environment

I have set up my Django REST API to use local storage in DEBUG mode and S3 storage in the production environment. This works well for public files, because I override DEFAULT_FILE_STORAGE like so:

if not IS_DEBUG:
    DEFAULT_FILE_STORAGE = 'api.storage_backends.PublicMediaStorage'

and every FileField uses it automatically. Now I want to use private S3 storage the same way, but because I have to define the storage explicitly (FileField(storage=PrivateMediaStorage())), the S3 storage is always used.
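For context, the field is currently declared roughly like this (Document is just a placeholder name):

from django.db import models

from api.storage_backends import PrivateMediaStorage


class Document(models.Model):
    # The explicit storage instance means S3 is used even in DEBUG mode
    file = models.FileField(upload_to="private/", storage=PrivateMediaStorage())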

How can I use the local storage instead of S3 storage when in DEBUG mode?

PS: I have already thought about changing the model to use a FileField with or without an explicit storage, depending on the DEBUG mode. This did not fully solve my problem, because my migrations are created in DEBUG mode and thus always contain the model without the private storage class.
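To illustrate the divergence, the same field deconstructs differently depending on which environment runs makemigrations (migration excerpts, paths shortened):

# Migration created in DEBUG mode: no storage kwarg at all
field=models.FileField(upload_to="private/")

# The same field when makemigrations runs in production
field=models.FileField(
    upload_to="private/",
    storage=api.storage_backends.PrivateMediaStorage(),
)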

UPDATE: I am looking for a solution that shares the same migrations in both environments and only instantiates the actual storage class lazily at runtime - just like Django already handles DEFAULT_FILE_STORAGE.

Upvotes: 8

Views: 4823

Answers (5)

finnuss

Reputation: 111

As nitsujri rightfully points out, get_storage_class() is deprecated and it's a lot easier to use the STORAGES setting now. Note that I use two different buckets for private and public files (as suggested by Thomas):

In my production settings:

STORAGES = {
    "default": {
        "BACKEND": "storages.backends.s3boto3.S3Boto3Storage",
        "OPTIONS": {
            "access_key": env("DJANGO_AWS_ACCESS_KEY_ID"),
            "secret_key": env("DJANGO_AWS_SECRET_ACCESS_KEY"),
            "bucket_name": env("DJANGO_AWS_STORAGE_BUCKET_NAME_PUBLIC"),
            "endpoint_url": env("DJANGO_AWS_S3_ENDPOINT_URL"),
            "querystring_auth": False,
            "file_overwrite": False,
        },
    },
    "private": {
        "BACKEND": "storages.backends.s3boto3.S3Boto3Storage",
        "OPTIONS": {
            "access_key": env("DJANGO_AWS_ACCESS_KEY_ID"),
            "secret_key": env("DJANGO_AWS_SECRET_ACCESS_KEY"),
            "bucket_name": env("DJANGO_AWS_STORAGE_BUCKET_NAME_PRIVATE"),
            "endpoint_url": env("DJANGO_AWS_S3_ENDPOINT_URL"),
            "querystring_auth": True,
            "querystring_expire": env("DJANGO_AWS_S3_QUERYSTRING_EXPIRE"),
            "file_overwrite": False,
        },
    },
    "staticfiles": {
        "BACKEND": "whitenoise.storage.CompressedManifestStaticFilesStorage",
    },
}

and in the models:

from django.core.files.storage import storages
from django.db import models


def select_private_storage():
    # Resolved at runtime, so both environments share the same migrations
    return storages["private"]


class YourModel(models.Model):  # example model name
    file = models.FileField(upload_to="abc/", storage=select_private_storage)
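Because querystring_auth is True for the private backend, reading the field's url attribute should return a time-limited presigned URL. A small usage sketch (untested with AWS itself, as noted below):

# The private backend signs its URLs because querystring_auth is True;
# they expire after querystring_expire seconds
doc = YourModel.objects.first()
signed_url = doc.file.url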

In local development mode, I use this:

STORAGES = {
    "default": {
        "BACKEND": "django.core.files.storage.FileSystemStorage",
    },
    "private": {
        "BACKEND": "django.core.files.storage.FileSystemStorage",
    },
    "staticfiles": {
        "BACKEND": "django.contrib.staticfiles.storage.StaticFilesStorage",
    },

}

I have never tested this with AWS S3 itself, but it works nicely with Hetzner's S3-compatible Object Storage.

Upvotes: 1

nitsujri

Reputation: 1591

Django 4.2 recently introduced the storages object.

This removes all the other hacks by moving to a reference-able object:

# settings.py
STORAGES = {
    "default": {
        "BACKEND": "django.core.files.storage.FileSystemStorage",
    },
    "custom_storage": {
        "BACKEND": "django.core.files.storage.FileSystemStorage",
    },
    "staticfiles": {
        "BACKEND": "django.contrib.staticfiles.storage.StaticFilesStorage",
    },
}

# example_app/models.py

from django.core.files.storage import storages

...
    avatar = models.FileField(
        blank=True,
        null=True,
        storage=storages["custom_storage"]
    )
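One caveat: indexing storages["custom_storage"] hands the field a concrete storage instance at import time, so that instance gets serialized into your migrations. If you need migrations to stay identical across environments, Django 3.1+ also accepts a callable; a sketch:

from django.core.files.storage import storages


def custom_storage():
    # Looked up lazily at runtime, so migrations only record the function
    return storages["custom_storage"]

...
    avatar = models.FileField(blank=True, null=True, storage=custom_storage)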

Also, get_storage_class is deprecated as of Django 4.2 and will be removed in a future release.

Upvotes: 3

Felipe Ferri

Reputation: 3588

Thomas's accepted answer is almost perfect. It has a small migration problem when you work with different settings for local development and production.

Suppose you set storage to FileSystemStorage in local environment and S3PrivateStorage in production. If you run makemigrations in the local environment, the migration file will set the storage field for your FileField to a different value than if you run makemigrations in the production environment.

Fortunately, a feature added in Django 3.1 lets us solve this easily with a slight change to Thomas's answer. Instead of using private_storage, which is an instance of a storage class, we can use the fact that storage may be a callable and write a function that returns the proper storage.

Then, the code (adapted from Thomas's answer) would be:

# yourapp/custom_storage.py

from django.conf import settings
from django.core.files.storage import get_storage_class
from storages.backends.s3boto import S3BotoStorage


class S3PrivateStorage(S3BotoStorage):
    """
    Optional: private S3 storage backed by its own bucket.
    """
    default_acl = "private"               # this does the trick

    def __init__(self):
        super(S3PrivateStorage, self).__init__()
        self.bucket_name = settings.S3_PRIVATE_STORAGE_BUCKET_NAME


def select_private_storage():
    # important: resolve the class from settings at call time
    private_storage_class = get_storage_class(settings.PRIVATE_FILE_STORAGE)
    return private_storage_class()  # instantiate the storage

and then, in your model, set the storage accordingly:

from yourapp.custom_storage import select_private_storage
...
class YourModel(Model):

    the_file = models.FileField(
        upload_to=...,
        storage=select_private_storage  # notice we're using the callable
    )
...
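With the callable in place, makemigrations should produce the same migration in both environments, because Django serializes a reference to the function instead of a concrete storage instance. The field then looks roughly like this in the migration file (excerpt):

# Excerpt: the callable is recorded by reference and only called at runtime
field=models.FileField(
    storage=yourapp.custom_storage.select_private_storage,
    upload_to=...,
)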

Upvotes: 2

Thomas Matecki

Reputation: 668

It sounds like the tricky part here is having both public and private media storage in a single project.

The example below assumes you are using django-storages, but the technique should work regardless.

Define a private storage by extending the S3BotoStorage class.

If using S3, it is probably prudent to store private and public files in different S3 buckets. This custom storage allows you to specify the bucket via settings.

# yourapp/custom_storage.py

from django.conf import settings
from django.core.files.storage import get_storage_class
from storages.backends.s3boto import S3BotoStorage


class S3PrivateStorage(S3BotoStorage):
    """
    Optional: private S3 storage backed by its own bucket.
    """
    default_acl = "private"               # this does the trick

    def __init__(self):
        super(S3PrivateStorage, self).__init__()
        self.bucket_name = settings.S3_PRIVATE_STORAGE_BUCKET_NAME


# important
private_storage_class = get_storage_class(settings.PRIVATE_FILE_STORAGE)

private_storage = private_storage_class()  # instantiate the storage

The important part is the last two lines of this file - they declare private_storage for use in your FileField:

from yourapp.custom_storage import private_storage
...
class YourModel(Model):

    the_file = models.FileField(
        upload_to=...,
        storage=private_storage)
...

Finally, in your settings file, something like this should do:

# settings.py

if DEBUG:
    # In debug mode, store everything on the filesystem
    DEFAULT_FILE_STORAGE = 'django.core.files.storage.FileSystemStorage'
    PRIVATE_FILE_STORAGE = 'django.core.files.storage.FileSystemStorage'
else:
    # In production, store public things using S3BotoStorage and private
    # things in a custom storage
    DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
    PRIVATE_FILE_STORAGE = 'yourapp.custom_storage.S3PrivateStorage'

As a last piece of unsolicited advice: it is often useful to decouple the storage settings from DEBUG mode and allow all of the parameters above to be specified as environment variables. It is likely that at some point you will want to run your app in debug mode with a production-like storage configuration.
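A sketch of what that could look like (the environment variable names here are made up):

# settings.py - read the storage backends from the environment and fall
# back to local filesystem storage
import os

DEFAULT_FILE_STORAGE = os.environ.get(
    "DEFAULT_FILE_STORAGE", "django.core.files.storage.FileSystemStorage"
)
PRIVATE_FILE_STORAGE = os.environ.get(
    "PRIVATE_FILE_STORAGE", "django.core.files.storage.FileSystemStorage"
)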

Upvotes: 8

JPG

Reputation: 88429

The best solution is to use FileField without an explicit storage class.

# settings.py

if DEBUG:
    DEFAULT_FILE_STORAGE = 'api.storage_backends.PublicMediaStorage'
else:
    DEFAULT_FILE_STORAGE = 'api.storage_backends.PrivateMediaStorage'


# models.py
class Foo(models.Model):
    file = models.FileField() # without storage

During the file upload process, Django resolves the DEFAULT_FILE_STORAGE class lazily.

Note

These settings won't put a storage parameter into the migration file.


UPDATE-1

If you want more control over the storage, create your own custom file field and wire it up in your models:

from django.conf import settings
from django.db import models


def get_storage():
    """
    Change this function in whatever way you need
    """
    # Imported here to avoid import-time side effects
    from api.storage_backends import PublicMediaStorage, PrivateMediaStorage
    if settings.DEBUG:
        return PublicMediaStorage()
    else:
        return PrivateMediaStorage()


class CustomFileField(models.FileField):
    def __init__(self, *args, **kwargs):
        kwargs['storage'] = get_storage()  # calling external function
        super().__init__(*args, **kwargs)


class Foo(models.Model):
    file = CustomFileField()  # use the custom field here
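Alternatively, if you are on Django 3.1 or newer, you could pass get_storage itself to a plain FileField as a callable, which also keeps the storage instance out of your migrations (a sketch, not tested here):

class Foo(models.Model):
    # get_storage is recorded by reference and only called at runtime
    file = models.FileField(storage=get_storage)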

Upvotes: 11
