beansontoast

Reputation: 181

s3 bucket returns NoneType

Exact same issue as here: Can't collectstatic to s3 via Heroku using boto - s3 bucket returns a NoneType

This still does not fix the issue.

Important part of my settings.py

from base64 import b64decode
from storages.backends.s3boto import S3BotoStorage


DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
STATICFILES_STORAGE = 'storages.backends.s3boto.S3BotoStorage'

AWS_S3_SECURE_URLS = False       # use http instead of https
AWS_QUERYSTRING_AUTH = False     # don't add complex authentication-related query parameters for requests
AWS_S3_ACCESS_KEY_ID = <snip>     # enter your access key id
AWS_S3_SECRET_ACCESS_KEY = <snip> # enter your secret 
AWS_STORAGE_BUCKET_NAME = 'mybucket/images/'
S3_URL = 'http://%s.s3.amazonaws.com/' % AWS_STORAGE_BUCKET_NAME
STATIC_URL = S3_URL

Upon runserver, collectstatic, or executing from celery, I get the same error:

File "/virtualenv/path/to/site-package/boto/s3/connection.py", line 94, in build_auth_path
path = '/' + bucket
TypeError: cannot concatenate 'str' and 'NoneType' objects

Environment variables are set for the key ID, access key, and bucket name. Interestingly, changing

S3_URL = 'http://%s.s3.amazonaws.com/' % AWS_STORAGE_BUCKET_NAME

to

S3_URL = 'http://%s.s3.amazonaws.com' % AWS_STORAGE_BUCKET_NAME

results in an even stranger error:

    from django.db.models import signals, sql
ImportError: cannot import name signals

The website was otherwise working fine before I started trying this s3 integration.

Upvotes: 1

Views: 1734

Answers (1)

bradenm

Reputation: 2170

Remove this line from your settings.py:

from storages.backends.s3boto import S3BotoStorage

It's not needed for anything, and it's what causes this error. When S3BotoStorage is imported, the class reads the bucket name and other settings from settings.py at import time. Because you trigger that import from inside settings.py itself, the module is only partially executed at that point: only variables defined above the import line exist yet, and everything defined below it, including the crucial AWS_STORAGE_BUCKET_NAME, shows up as None.

As a side note, your bucket name setting (AWS_STORAGE_BUCKET_NAME) should not contain a '/', which is not allowed in a bucket name. That may cause other errors for you after this one is fixed. You can include the full path in your S3_URL setting, though.
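Putting both fixes together, a minimal sketch of the corrected settings.py might look like the following. It assumes the bucket is actually named 'mybucket' and that files should live under an 'images/' prefix (the key values shown are placeholders):

```python
# settings.py (excerpt) -- note: no
# "from storages.backends.s3boto import S3BotoStorage" at the top.

DEFAULT_FILE_STORAGE = 'storages.backends.s3boto.S3BotoStorage'
STATICFILES_STORAGE = 'storages.backends.s3boto.S3BotoStorage'

AWS_S3_SECURE_URLS = False       # use http instead of https
AWS_QUERYSTRING_AUTH = False     # no auth query parameters on URLs
AWS_S3_ACCESS_KEY_ID = 'your-key-id'          # placeholder
AWS_S3_SECRET_ACCESS_KEY = 'your-secret-key'  # placeholder

# Bucket name only -- no '/' allowed here.
AWS_STORAGE_BUCKET_NAME = 'mybucket'

# The path prefix belongs in the URL, not the bucket name.
S3_URL = 'http://%s.s3.amazonaws.com/images/' % AWS_STORAGE_BUCKET_NAME
STATIC_URL = S3_URL
```

With this arrangement, django-storages sees a valid bucket name when it loads, and static URLs still point at the images/ prefix.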

Upvotes: 1
