Reputation: 43
I am tasked with providing a lab environment that will be accessed by many customers. In the lab the customers will be sending data into S3 buckets. Ideally they will only send a minimal amount of data (a few hundred MB), but I am trying to avoid the situation where someone kicks off a large data transfer, by accident or on purpose, and lets it run without my knowledge. Is there any way to restrict the amount of data sent to an S3 bucket, or at least to notify me of large uploads?
Upvotes: 0
Views: 381
Reputation: 361
You can apply a POST policy to the bucket. Below is an example that restricts the file type and the file size (the expiration value here is built with PHP's date() function):
{
    "expiration": "'.date('Y-m-d\TG:i:s\Z', time()+10).'",
    "conditions": [
        {"bucket": "xxx"},
        {"acl": "public-read"},
        ["starts-with", "xxx", ""],
        {"success_action_redirect": "xxx"},
        ["starts-with", "$Content-Type", "image/jpeg"],
        ["content-length-range", 0, 104857600]
    ]
}
In this case, if the uploaded file is larger than 100 MB (100 * 1024 * 1024 = 104857600 bytes), the upload request will be rejected by Amazon.
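As a minimal sketch of the same idea in Python (the bucket name, redirect URL, and size limit below are placeholders, not values from your environment), you can build the policy document and base64-encode it the way the browser-based POST upload flow expects; the signature step that follows is omitted here:

```python
import base64
import json
from datetime import datetime, timedelta, timezone

def build_post_policy(bucket, redirect_url, max_bytes, lifetime_seconds=10):
    """Build an S3 POST policy document capping upload size via
    content-length-range, and return it base64-encoded for the form."""
    expiration = datetime.now(timezone.utc) + timedelta(seconds=lifetime_seconds)
    policy = {
        "expiration": expiration.strftime("%Y-%m-%dT%H:%M:%SZ"),
        "conditions": [
            {"bucket": bucket},
            {"acl": "public-read"},
            ["starts-with", "$key", ""],
            {"success_action_redirect": redirect_url},
            ["starts-with", "$Content-Type", "image/jpeg"],
            # Uploads larger than max_bytes are rejected by S3 itself.
            ["content-length-range", 0, max_bytes],
        ],
    }
    return base64.b64encode(json.dumps(policy).encode("utf-8")).decode("ascii")

# Hypothetical usage: cap uploads at 100 MB.
encoded = build_post_policy("example-bucket", "https://example.com/done",
                            100 * 1024 * 1024)
```

The encoded policy goes into the upload form's policy field (along with a signature derived from it), so the size limit is enforced server-side by S3 rather than by anything the customer controls.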
Upvotes: 1