user1427661

Reputation: 11774

Mocking a Custom File Storage Backend

I've created a custom file storage backend that calls out to Amazon S3 using boto and stores the files there (I know django-storages handles this as well, but we ran into several issues with it). I'm storing it in a utils module and using it in my models like this:

from django.db import models

from utils.s3 import S3Storage


class Photo(models.Model):
    image = models.ImageField(storage=S3Storage(), upload_to="images")

Thus any time a photo is created with an image file, the image file is uploaded to an S3 bucket.
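
For context, here is a simplified sketch of what utils/s3.py does (illustrative only; the real backend reads credentials and the bucket name from settings and handles a few more edge cases):

# Simplified, illustrative version of the backend.
from boto.s3.connection import S3Connection
from boto.s3.key import Key
from django.core.files.base import ContentFile
from django.core.files.storage import Storage


class S3Storage(Storage):
    def __init__(self, bucket_name="my-bucket"):
        connection = S3Connection("ACCESS_KEY", "SECRET_KEY")
        self.bucket = connection.get_bucket(bucket_name)

    def _open(self, name, mode="rb"):
        key = Key(self.bucket, name)
        return ContentFile(key.get_contents_as_string())

    def _save(self, name, content):
        key = Key(self.bucket, name)
        key.set_contents_from_file(content)
        return name

    def exists(self, name):
        return Key(self.bucket, name).exists()

    def delete(self, name):
        self.bucket.delete_key(name)

    def url(self, name):
        return Key(self.bucket, name).generate_url(expires_in=3600)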

I don't want to make calls out to S3 during my tests, but figuring out exactly what to mock in this situation is difficult. I can't mock out the entire image field, because I need to test creating the model through Tastypie.

Any ideas?

Upvotes: 7

Views: 4854

Answers (3)

trubliphone

Reputation: 4504

Something like this could work for pytest:

import os

import pytest
from django.core.files.storage import get_storage_class


@pytest.fixture
def mock_storage(monkeypatch):
    """
    Mocks the default storage backend so tests never touch real media.
    """

    def clean_name(name):
        return os.path.splitext(os.path.basename(name))[0]

    def _mock_save(instance, name, content):
        # Record on the storage instance itself that "name" was saved,
        # so _mock_exists can report it later.
        setattr(instance, f"mock_{clean_name(name)}_exists", True)
        return str(name).replace('\\', '/')

    def _mock_delete(instance, name):
        setattr(instance, f"mock_{clean_name(name)}_exists", False)

    def _mock_exists(instance, name):
        return getattr(instance, f"mock_{clean_name(name)}_exists", False)

    # Patch the class, so every instance of the default storage is affected.
    storage_class = get_storage_class()

    monkeypatch.setattr(storage_class, "_save", _mock_save)
    monkeypatch.setattr(storage_class, "delete", _mock_delete)
    monkeypatch.setattr(storage_class, "exists", _mock_exists)

Upvotes: 1

Menda

Reputation: 1813

I tried many other solutions, such as overriding DEFAULT_FILE_STORAGE in settings or Manh Tai's approach. The problem with all of them is that Django loads every model into memory at startup, which makes it rather unintuitive to modify a model attribute once it has been set.

Tested with Django 2.1 and Python 3:

from unittest.mock import MagicMock

from django.core.files.storage import Storage
from django.core.files.uploadedfile import SimpleUploadedFile
from django.test import TestCase

from myapp.models import Photo  # wherever your Photo model lives


class CreatePhotoTest(TestCase):
    def test_post_photo(self):
        def generate_filename(filename):
            return filename

        def save(name, content, max_length):
            return name

        storage_mock = MagicMock(spec=Storage, name='StorageMock')
        storage_mock.generate_filename = generate_filename
        storage_mock.save = MagicMock(side_effect=save)
        storage_mock.url = MagicMock(name='url')
        storage_mock.url.return_value = 'http://example.com/generated_filename.png'

        # Swap the real storage on the model field for the mock.
        Photo._meta.get_field('image').storage = storage_mock

        img = SimpleUploadedFile('file.png', b"file_content", content_type="image/png")
        data = {
            'image': img
        }
        response = self.client.post('/endpoint', data)

        self.assertTrue(storage_mock.save.called)
        generated_filename = storage_mock.save.call_args_list[0][0][0]
        uploaded_file = storage_mock.save.call_args_list[0][0][1]
        self.assertEqual(uploaded_file.name, 'file.png')

I created generate_filename() and save(), but you don't have to; I did it just to emulate the behaviour of a real Storage as closely as possible and to verify it in the test.
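
Note that assigning the mock directly leaves it on the field for any tests that run afterwards, because the field object is shared process-wide. If that is a concern, you could wrap the swap in patch.object so the original storage is restored automatically, something like:

from unittest.mock import patch

with patch.object(Photo._meta.get_field('image'), 'storage', storage_mock):
    response = self.client.post('/endpoint', data)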

Upvotes: 2

Manh Tai

Reputation: 376

You can just mock out the _save method of the S3Storage class to avoid uploading to S3, and use FileSystemStorage's _save instead.

My solution for your case would look like this:

import mock  # on Python 3 you can use: from unittest import mock

from django.core.files.storage import FileSystemStorage

from utils.s3 import S3Storage

fss = FileSystemStorage()


# While the patch is active, S3Storage._save writes to local disk instead of S3.
@mock.patch.object(S3Storage, '_save', fss._save)
def test_something():
    assert True  # exercise code that saves files here
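
And a rough sketch of the same idea applied to a test that actually saves a Photo (assuming the model from the question; depending on what your backend does, you may also want to patch exists(), since Django calls it via get_available_name() before saving):

import mock
from django.core.files.storage import FileSystemStorage
from django.core.files.uploadedfile import SimpleUploadedFile
from django.test import TestCase

from myapp.models import Photo  # wherever your Photo model lives
from utils.s3 import S3Storage

fss = FileSystemStorage()


class PhotoTest(TestCase):
    @mock.patch.object(S3Storage, '_save', fss._save)
    @mock.patch.object(S3Storage, 'exists', fss.exists)
    def test_create_photo(self):
        photo = Photo.objects.create(
            image=SimpleUploadedFile('test.png', b'fake image bytes')
        )
        # The file ends up under MEDIA_ROOT instead of in the S3 bucket.
        self.assertTrue(photo.image.name.startswith('images/'))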

Upvotes: 5
