I'm writing an application that will run on AWS. It handles user uploads such as images and videos, and I want to store that data in S3. I'm wondering how to prevent a single user from uploading gigabytes of data, since I pay for the storage. I could limit how much each user account is allowed to upload.
But if someone wanted to, they could create several accounts and fill my storage with trash. I could also check my bucket's size before every upload and only accept the upload if the bucket is below some volume, say 50 GB. But that computation is far too expensive to run for every simple upload.
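For illustration, here is roughly what that check would look like (a sketch in Python with boto3; the bucket name is a placeholder):

```python
import boto3

s3 = boto3.client("s3")

def bucket_size_bytes(bucket: str) -> int:
    """Sum the size of every object: one LIST request per 1,000 objects."""
    total = 0
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket):
        for obj in page.get("Contents", []):
            total += obj["Size"]
    return total

# Running this gatekeeper before every upload re-lists the whole bucket each time.
if bucket_size_bytes("my-uploads-bucket") < 50 * 1024**3:  # 50 GB
    pass  # proceed with the upload
```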
Should I switch to another storage service where I can set a limit? Or is there a common way to solve this problem? Or should I just trust my users?
Without knowing your use case in detail, it is difficult to recommend a solution. There is no single solution, but rather a combination of them:

- Use S3 event notifications for PutObject. The way it works is that every PutObject in your bucket triggers a Lambda function, which receives some information about the object, including its size and the uploader's IP address. You can write a simple Python/Node.js/Java function to track the size per IP in a small database (either a micro Redis or DynamoDB). If you see too many uploads, or an unusually large upload, from a particular IP, generate a bucket policy that blocks that IP, attach it to your bucket, and send yourself an SES alert email. A sketch of such a function is below.
- Set up a CloudWatch alarm that fires when the bucket's BucketSizeBytes or NumberOfObjects metric exceeds a limit (see the second sketch below).
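A minimal sketch of such a Lambda handler, assuming Python with boto3, a hypothetical DynamoDB table named upload-tracking keyed on ip, and placeholder email addresses and threshold:

```python
import boto3

dynamodb = boto3.client("dynamodb")
ses = boto3.client("ses")

TABLE = "upload-tracking"      # hypothetical table with partition key "ip" (string)
LIMIT_BYTES = 1 * 1024**3      # placeholder: flag any IP that has uploaded > 1 GB

def handler(event, context):
    # One invocation can batch several S3 event records.
    for record in event["Records"]:
        ip = record["requestParameters"]["sourceIPAddress"]
        size = record["s3"]["object"]["size"]

        # Atomically add this upload's size to the running total for the IP.
        resp = dynamodb.update_item(
            TableName=TABLE,
            Key={"ip": {"S": ip}},
            UpdateExpression="ADD total_bytes :s",
            ExpressionAttributeValues={":s": {"N": str(size)}},
            ReturnValues="UPDATED_NEW",
        )
        total = int(resp["Attributes"]["total_bytes"]["N"])

        if total > LIMIT_BYTES:
            # Placeholder addresses; SES must have these verified.
            ses.send_email(
                Source="alerts@example.com",
                Destination={"ToAddresses": ["me@example.com"]},
                Message={
                    "Subject": {"Data": f"Suspicious S3 uploads from {ip}"},
                    "Body": {"Text": {"Data": f"{ip} has uploaded {total} bytes so far."}},
                },
            )
```

Actually blocking the IP would then be a matter of calling put_bucket_policy with a Deny statement whose Condition matches aws:SourceIp for the offending address.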
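And a sketch of the CloudWatch alarm, again with boto3; the bucket name, threshold, and SNS topic ARN are placeholders. Note that S3's storage metrics are reported once per day, hence the one-day period:

```python
import boto3

cloudwatch = boto3.client("cloudwatch")

cloudwatch.put_metric_alarm(
    AlarmName="uploads-bucket-too-big",
    Namespace="AWS/S3",
    MetricName="BucketSizeBytes",
    Dimensions=[
        {"Name": "BucketName", "Value": "my-uploads-bucket"},
        {"Name": "StorageType", "Value": "StandardStorage"},
    ],
    Statistic="Maximum",
    Period=86400,                # S3 storage metrics are emitted once per day
    EvaluationPeriods=1,
    Threshold=50 * 1024**3,      # 50 GB, matching the limit from the question
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:storage-alerts"],  # placeholder SNS topic
)
```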