Reputation: 110
I was working with the boto3 module in Python and I built a bot that finds publicly accessible buckets, but it does this for a single user with his credentials. I am thinking of extending it so that the bot fetches all publicly accessible buckets across every user's account. I would like to know if this is possible; if yes, how, and if not, why?
Upvotes: 0
Views: 2268
Reputation: 179084
This is not possible.
There is no way to discover the names of all of the millions of buckets that exist. There are known to be at least 2,000,000,000,000 objects stored in S3, a number announced several years ago and probably substantially lower than the real number now. If each bucket had 1,000,000 of those objects, that would mean 2,000,000 buckets to hold them.
You have neither the time nor the permission to scan them all, and intuition suggests that AWS Security would start asking questions if you tried.
Upvotes: 1
Reputation: 2970
Look into the method get_bucket_acl().
If a bucket is public, you should see an ACL grant for the Grantee http://acs.amazonaws.com/groups/global/AllUsers.
Example:
>>> client.get_bucket_acl(Bucket='PublicBucket')
....
{'Grantee': {u'Type': 'Group',
             'URI': 'http://acs.amazonaws.com/groups/global/AllUsers'},
 'Permission': 'READ'}], ...
As you can see, the AllUsers global group is allowed READ access on this bucket.
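Putting this together, here is a minimal sketch (my own illustration, not code from the answer): it lists only the buckets your own credentials can see via list_buckets() and flags any ACL grant to the AllUsers group. Variable names are placeholders.
# A minimal sketch, assuming default boto3 credentials; it only covers buckets
# your own account can list, and flags ACL grants to the AllUsers group.
import boto3

ALL_USERS_URI = 'http://acs.amazonaws.com/groups/global/AllUsers'

client = boto3.client('s3')

for bucket in client.list_buckets()['Buckets']:
    name = bucket['Name']
    acl = client.get_bucket_acl(Bucket=name)
    for grant in acl['Grants']:
        grantee = grant.get('Grantee', {})
        if grantee.get('Type') == 'Group' and grantee.get('URI') == ALL_USERS_URI:
            print(f"{name}: {grant['Permission']} granted to AllUsers")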
You might also want to check get_bucket_policy() and, if there is a policy attached to the bucket, make sure it does not allow public access.
Example:
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Sid": "AddPerm",
      "Effect": "Allow",
      "Principal": "*",
      "Action": ["s3:GetObject"],
      "Resource": ["arn:aws:s3:::examplebucket/*"]
    }
  ]
}
"Principal": "*" indicates that everyone has access to the s3:GetObject action on examplebucket/*.
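For the policy side, a minimal sketch along the same lines (again my own illustration; 'examplebucket' is a placeholder). Note that get_bucket_policy() raises a ClientError with code NoSuchBucketPolicy when no policy is attached, so that case is handled explicitly.
import json
import boto3
from botocore.exceptions import ClientError

client = boto3.client('s3')
bucket = 'examplebucket'  # placeholder bucket name

try:
    # The policy document comes back as a JSON string under the 'Policy' key.
    policy = json.loads(client.get_bucket_policy(Bucket=bucket)['Policy'])
except ClientError as e:
    if e.response['Error']['Code'] == 'NoSuchBucketPolicy':
        policy = None  # no policy attached, so nothing is granted this way
    else:
        raise

if policy:
    for statement in policy.get('Statement', []):
        # Allow statements with a wildcard principal grant access to everyone.
        if (statement.get('Effect') == 'Allow'
                and statement.get('Principal') in ('*', {'AWS': '*'})):
            print(f"{bucket}: statement {statement.get('Sid')} is public, "
                  f"allows {statement.get('Action')}")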
This article might help as well.
Upvotes: 1