Reputation: 3630
I am trying to create a super simple file upload script using only the boto library, nothing else. From what I have tried it feels like it should work, but it doesn't.
The error I am getting now is:
S3ResponseError: 400 Bad Request
Here is the code I have in my view:
def upload_file(request):
    if request.method == 'POST':
        form = UploadFileForm(request.POST, request.FILES)
        file = request.FILES['file']
        filename = file.name
        conn = boto.connect_s3()
        bucket = conn.create_bucket('some-bucket-name')
        from boto.s3.key import Key
        k = Key(bucket)
        k.key = filename
        k.send_file(file)
        k.content_type = mimetypes.guess_type(filename)[0]
        k.set_contents_from_stream(file.chunks())
        k.set_acl('public-read')
        return HttpResponseRedirect('/')
    else:
        form = UploadFileForm()
    return render_to_response('home/upload.html',
                              {'form': form},
                              context_instance=RequestContext(request))
If I modify it to save the file locally it works, so it is the upload to S3 that is broken. I have tested set_contents_from_string and that works for string data. However, with anything that deals with files or streams I get the above error. Am I missing a setting somewhere, or is what I am doing just completely wrong?
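For reference, the string upload that does work looks roughly like this (a minimal sketch; the bucket name is a placeholder and my credentials come from the boto config/environment):

import boto
from boto.s3.key import Key

conn = boto.connect_s3()
bucket = conn.get_bucket('some-bucket-name')
k = Key(bucket)
k.key = 'test.txt'
# Uploading in-memory string data succeeds
k.set_contents_from_string('some test data')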
Upvotes: 1
Views: 3354
Reputation: 4515
I ran into this exact problem trying to transfer a file to S3. Eventually, I figured out that I had to set the size property on the Key object before calling send_file.
k = Key(bucket)
k.key = 'some-key'
k.size = 12345
k.send_file(file)
The size can be found using seek and tell on the file. The following will find the size while preserving the current file position. In your case, you can dispense with remembering the current position and just seek back to zero after getting the file size.
import os

position = file.tell()
file.seek(0, os.SEEK_END)
size = file.tell()
file.seek(position)
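Applied to your view, the upload part might look roughly like this (a sketch only; I'm assuming the bucket already exists and that the uploaded file object supports seek and tell, which Django's uploaded files do):

import os
import mimetypes

import boto
from boto.s3.key import Key

def upload_to_s3(file, filename, bucket_name='some-bucket-name'):
    # bucket_name is a placeholder; get_bucket assumes the bucket already exists
    conn = boto.connect_s3()
    bucket = conn.get_bucket(bucket_name)

    k = Key(bucket)
    k.key = filename
    k.content_type = mimetypes.guess_type(filename)[0] or 'application/octet-stream'

    # send_file needs the size set on the Key beforehand
    file.seek(0, os.SEEK_END)
    k.size = file.tell()
    file.seek(0)

    k.send_file(file)
    k.set_acl('public-read')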
Upvotes: 2
Reputation: 2109
I'd be inclined to test your S3 connection and bucket-creation steps in the shell:
python manage.py shell
I'm curious whether it's those steps that are tripping you up. For instance, if the bucket name you specify isn't globally unique you will receive an error (I'm not sure it would produce the exact error code you're seeing, but that is the first place I'd check).
If that is the issue, you might consider creating a bucket in the AWS Management Console, connecting to it from your view, and uploading files using appropriate 'folder-like' keys based on the needs of your project (see: Amazon S3 boto - how to create a folder?).
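For example, something along these lines in the shell would tell you whether the connection and bucket steps succeed on their own (the bucket name and key are placeholders):

>>> import boto
>>> from boto.s3.key import Key
>>> conn = boto.connect_s3()
>>> conn.get_all_buckets()                        # lists buckets your credentials can see
>>> bucket = conn.get_bucket('some-bucket-name')  # raises S3ResponseError if it doesn't exist or isn't yours
>>> k = Key(bucket)
>>> k.key = 'uploads/test.txt'                    # 'folder-like' key
>>> k.set_contents_from_string('just a test')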
Upvotes: 0