Reputation: 896
I have a bunch of S3 folders for different projects/clients and I would like to estimate their total size (so I can, for instance, consider reducing sizes/cost). What is a good way to determine this?
Upvotes: 1
Views: 4338
Reputation: 1869
This should do the trick 🚀
# Iterate over every bucket (the bucket name is the third column of `aws s3 ls` output)
for bucket_name in $(aws s3 ls | awk '{print $3}'); do
    echo "$bucket_name"
    # --summarize appends "Total Objects" and "Total Size" lines; keep just those two
    aws s3 ls "s3://$bucket_name" --recursive --summarize | tail -n2
done
Upvotes: 2
Reputation: 183
If you want to check via the console: select the folder (or objects) in the S3 console and choose Actions → Calculate total size. The bucket's Metrics tab also shows the total bucket size, sourced from CloudWatch storage metrics.
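Since the console's size figure comes from CloudWatch storage metrics, the same number can be pulled from the CLI. A minimal sketch, assuming a placeholder bucket named BUCKETNAME and GNU date (BucketSizeBytes is reported roughly once per day, hence the two-day window):

aws cloudwatch get-metric-statistics \
  --namespace AWS/S3 \
  --metric-name BucketSizeBytes \
  --dimensions Name=BucketName,Value=BUCKETNAME Name=StorageType,Value=StandardStorage \
  --start-time "$(date -u -d '2 days ago' +%Y-%m-%dT%H:%M:%S)" \
  --end-time "$(date -u +%Y-%m-%dT%H:%M:%S)" \
  --period 86400 \
  --statistics Average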
Upvotes: 0
Reputation: 2243
As stated here, the AWS CLI natively supports the --query parameter, which can sum the size of every object in an S3 bucket.
aws s3api list-objects --bucket BUCKETNAME --output json --query "[sum(Contents[].Size), length(Contents[])]"
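Since the question is about per-project folders, the same query can be scoped to a single folder with --prefix. A sketch, where PROJECTNAME/ is a placeholder prefix:

aws s3api list-objects --bucket BUCKETNAME --prefix "PROJECTNAME/" --output json --query "[sum(Contents[].Size), length(Contents[])]"

The CLI paginates list-objects automatically, so the sum should cover all matching keys, not just the first page.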
I hope it helps.
Upvotes: 1
Reputation: 896
I can do this with a combination of Python and the AWS CLI:
import os

# `aws s3 ls` lists every bucket, one per line: "<date> <time> <name>"
bucket_rows = os.popen('aws s3 ls').read().splitlines()

sizes = dict()
for bucket in bucket_rows:
    buck = bucket.split(' ')[-1]  # the full row contains additional information
    cmd = f"aws s3 ls --summarize --human-readable --recursive s3://{buck}/ | grep 'Total'"
    sizes[buck] = os.popen(cmd).read()
Upvotes: 1