Lodewijck

Reputation: 416

How to limit user uploads in S3

I'm writing an application that will run on AWS. It handles user uploads like images and videos, and I want to store that data in S3. I'm wondering how to prevent a single user from uploading gigabytes of data, since I pay for the storage. I could put an upload limit on each user account.

But if someone wants, they could create several accounts and still fill my storage with junk. I could also check my bucket size before every upload and only allow the upload while the bucket stays under some limit, say 50 GB, but that check is far too expensive to run for a simple upload (see the sketch below).
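
To show what I mean, this is roughly the check I had in mind with boto3 (the bucket name and the 50 GB limit are just placeholders); it has to list and sum every object in the bucket, which is why it seems too expensive to run on every upload:

```python
import boto3

MAX_BUCKET_BYTES = 50 * 1024 ** 3  # 50 GB, just an example limit

def bucket_size_bytes(bucket_name):
    """Sum the size of every object in the bucket (slow for large buckets)."""
    s3 = boto3.client("s3")
    total = 0
    paginator = s3.get_paginator("list_objects_v2")
    for page in paginator.paginate(Bucket=bucket_name):
        for obj in page.get("Contents", []):
            total += obj["Size"]
    return total

def can_upload(bucket_name):
    # Every call lists the whole bucket again, so this does not scale.
    return bucket_size_bytes(bucket_name) < MAX_BUCKET_BYTES
```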

Should I switch to another storage service where I can set a limit? Or is there a common way to solve this? Or should I just trust my users?

Upvotes: 6

Views: 3058

Answers (1)

helloV

Reputation: 52453

Without knowing your use case in detail, it is difficult to recommend a solution. There is no single solution; rather, use a combination of approaches:

  • Make your bucket private - Don't let anyone upload to it directly; instead, generate a signed S3 URL for each request with a very short expiration (say 5 minutes) and let the user upload his/her image with that signed URL (see the presigned-URL sketch after this list).
  • Use AWS Lambda (very cheap) to monitor PutObject events on your bucket. A PutObject in your bucket triggers a Lambda function, which receives some information about the object, including its size and the uploader's IP address. You can write a simple Python/Node.js/Java function to track the size and IP in some DB (either a micro Redis or DynamoDB); a sketch of such a function is after this list. If you see too many uploads or a very large upload from a particular IP, generate a bucket policy that blocks that IP, attach it to your bucket, and send yourself an SES email.
  • Use CloudWatch - You can have CloudWatch send an alert if the BucketSizeBytes or NumberOfObjects metric exceeds a limit (see the alarm sketch after this list).
  • Though I haven't used them, you can also set up AWS Billing Alerts so that you know as soon as possible if your bill exceeds a preset threshold, instead of being surprised at the end of the billing cycle.
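
Here is a rough sketch of the presigned-URL approach with boto3 (the bucket name, key, and limits are placeholders, not your real values). Note that generate_presigned_post also accepts a content-length-range condition, so S3 itself rejects any single upload larger than the size you allow:

```python
import boto3

s3 = boto3.client("s3")

def presigned_upload(key, max_bytes=10 * 1024 ** 2, expires=300):
    """Return a URL + form fields the client POSTs the file to directly.

    The content-length-range condition makes S3 reject anything larger
    than max_bytes, and the URL expires after `expires` seconds.
    """
    return s3.generate_presigned_post(
        Bucket="my-upload-bucket",   # placeholder bucket name
        Key=key,
        Conditions=[["content-length-range", 0, max_bytes]],
        ExpiresIn=expires,
    )

# Example: hand this dict to the browser; it POSTs the file straight to S3.
post = presigned_upload("uploads/user123/avatar.png")
```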
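
And a rough sketch of the Lambda function (the DynamoDB table upload_stats and its attribute names are assumptions). The S3 event record already carries the object size and the uploader's IP, so accumulating per-IP totals is a single update:

```python
import boto3

dynamodb = boto3.resource("dynamodb")
table = dynamodb.Table("upload_stats")   # assumed table with partition key "ip"

def lambda_handler(event, context):
    """Triggered by s3:ObjectCreated:Put; accumulates bytes per source IP."""
    for record in event["Records"]:
        ip = record["requestParameters"]["sourceIPAddress"]
        size = record["s3"]["object"]["size"]
        table.update_item(
            Key={"ip": ip},
            UpdateExpression="ADD total_bytes :b, upload_count :one",
            ExpressionAttributeValues={":b": size, ":one": 1},
        )
        # From here you could check the running totals and, if an IP looks
        # abusive, update the bucket policy to deny it and send an SES email.
```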
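
Finally, a sketch of the CloudWatch alarm (the SNS topic ARN is a placeholder). Keep in mind the S3 storage metrics are only reported about once a day, so this is a safety net rather than a real-time limit:

```python
import boto3

cloudwatch = boto3.client("cloudwatch")

cloudwatch.put_metric_alarm(
    AlarmName="s3-bucket-size-over-50gb",
    Namespace="AWS/S3",
    MetricName="BucketSizeBytes",
    Dimensions=[
        {"Name": "BucketName", "Value": "my-upload-bucket"},   # placeholder
        {"Name": "StorageType", "Value": "StandardStorage"},
    ],
    Statistic="Average",
    Period=86400,              # the S3 storage metrics are daily
    EvaluationPeriods=1,
    Threshold=50 * 1024 ** 3,  # 50 GB, matching the example in the question
    ComparisonOperator="GreaterThanThreshold",
    AlarmActions=["arn:aws:sns:us-east-1:123456789012:upload-alerts"],  # assumed SNS topic
)
```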

Upvotes: 8

Related Questions