HyperDevil

Reputation: 2639

Python 3 Boto 3, AWS S3: Get object URL

I need to retrieve a public object URL directly after uploading a file, so that I can store it in a database. This is my upload code:

   s3 = boto3.resource('s3')
   s3bucket = s3.Bucket(bucket_name)
   s3bucket.upload_file(filepath, objectname, ExtraArgs={'StorageClass': 'STANDARD_IA'})

I am not looking for a presigned URL, just the URL that will always be publicly accessible over HTTPS.

Any help appreciated.

Upvotes: 34

Views: 68643

Answers (6)

oferei

Reputation: 1828

You can generate a presigned URL and then trim its query parameters. This requires the "s3:PutObject" permission for the relevant bucket.

s3client = boto3.client('s3')

url = s3client.generate_presigned_url(ClientMethod='put_object',
                                      Params={'Bucket': bucket_name, 'Key': key})

# trim query params
url = url[0:url.index('?')]
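To see the trimming step in isolation, here is a sketch using a made-up presigned URL (no AWS call involved):

```python
# Hypothetical presigned URL; the query string carries the signature and expiry.
presigned = ("https://example-bucket.s3.amazonaws.com/some/key.txt"
             "?X-Amz-Signature=abc123&X-Amz-Expires=3600")

# Dropping everything from '?' onward leaves the plain object URL.
url = presigned[:presigned.index('?')]
print(url)  # https://example-bucket.s3.amazonaws.com/some/key.txt
```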

Upvotes: 5

lionels

Reputation: 882

Since 2010 you can use a virtual-hosted-style S3 URL, i.e. there is no need to mess with region-specific URLs:

url = f'https://{bucket}.s3.amazonaws.com/{key}'

With a quoted key:

url = f'''https://{bucket}.s3.amazonaws.com/{urllib.parse.quote(key, safe="~()*!.'")}'''

Moreover, support for the path-style model (region-specific URLs) continues for buckets created on or before September 30, 2020. Buckets created after that date must be referenced using the virtual-hosted model.

See also this blog post.
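As a runnable illustration of the virtual-hosted-style URL with a quoted key (the bucket name and key below are made up):

```python
import urllib.parse

bucket = "my-bucket"            # hypothetical bucket name
key = "folder/my file+v2.txt"   # key with characters that need quoting

quoted = urllib.parse.quote(key, safe="~()*!.'")
url = f"https://{bucket}.s3.amazonaws.com/{quoted}"
print(url)  # https://my-bucket.s3.amazonaws.com/folder%2Fmy%20file%2Bv2.txt
```

Note that this `safe` set also percent-encodes `/` (as `%2F`); add `/` to `safe` if you want path separators kept literal.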

Upvotes: 38

Julian Go

Reputation: 4492

Concatenating the raw key will fail for some special characters in the key (e.g. '+'); you have to quote them:

url = "https://s3-%s.amazonaws.com/%s/%s" % (
    location,
    bucket_name,
    urllib.parse.quote(key, safe="~()*!.'"),
)

Or you can call:

import botocore
from botocore.config import Config

my_config = Config(signature_version=botocore.UNSIGNED)
url = boto3.client("s3", config=my_config).generate_presigned_url(
    "get_object", ExpiresIn=0, Params={"Bucket": bucket_name, "Key": key}
)

...as described here.
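To see why '+' in particular needs quoting, here is a small offline sketch (the key is hypothetical):

```python
import urllib.parse

key = "reports/2021+q1.txt"   # hypothetical key containing '+'

# In a raw URL many servers decode '+' as a space, so the object would
# not be found; percent-encoding makes the character unambiguous.
quoted = urllib.parse.quote(key, safe="~()*!.'")
print(quoted)  # reports%2F2021%2Bq1.txt
```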

Upvotes: 7

Shimon

Reputation: 172

Just a small note. The function call

location = boto3.client('s3').get_bucket_location(Bucket=bucket_name)['LocationConstraint']

may return location = None if the bucket is in the region 'us-east-1'. Therefore, I'd amend the above answer and add a line below that line:

if location is None:
    location = 'us-east-1'
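The fallback can be sketched as a tiny helper (pure Python, no AWS call; the region strings are just examples):

```python
def effective_region(location_constraint):
    # get_bucket_location returns LocationConstraint=None for buckets
    # in us-east-1, so substitute the region name explicitly.
    if location_constraint is None:
        return 'us-east-1'
    return location_constraint

print(effective_region(None))         # us-east-1
print(effective_region('eu-west-1'))  # eu-west-1
```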

Upvotes: -3

ofrommel

Reputation: 2177

There's no simple way, but you can construct the URL from the region where the bucket is located (get_bucket_location), the bucket name, and the storage key:

bucket_name = "my-aws-bucket"
key = "upload-file"

s3 = boto3.resource('s3')
bucket = s3.Bucket(bucket_name)
bucket.upload_file("upload.txt", key)
location = boto3.client('s3').get_bucket_location(Bucket=bucket_name)['LocationConstraint']
url = "https://s3-%s.amazonaws.com/%s/%s" % (location, bucket_name, key)

Upvotes: 33
