Zach Latta

Reputation: 3331

Access files stored on Amazon S3 through web browser

Current Situation

I have a project on GitHub that builds after every commit on Travis-CI. After each successful build Travis uploads the artifacts to an S3 bucket. Is there some way for me to easily let anyone access the files in the bucket? I know I could generate a read-only access key, but it'd be easier for the user to access the files through their web browser.
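For context, a Travis CI deploy stanza for S3 typically looks something like the following (a sketch only; the bucket name is a placeholder and the credentials would come from encrypted environment variables):

```yaml
deploy:
  provider: s3
  access_key_id: $AWS_ACCESS_KEY_ID
  secret_access_key: $AWS_SECRET_ACCESS_KEY
  bucket: my-artifacts
  skip_cleanup: true
  acl: public_read
```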

I have static website hosting enabled, with the index document set to ".".

Screenshot of website hosting setup

However, I still get a 403 Forbidden error when trying to visit the bucket's endpoint.

403 Forbidden

The Question

How can I let users easily browse and download artifacts stored on Amazon S3 from their web browser? Preferably without a third-party client.

Upvotes: 39

Views: 187204

Answers (5)

Justin Pierce

Reputation: 252

https://github.com/jupierce/aws-s3-web-browser-file-listing is a solution I developed for this use case. It leverages AWS CloudFront and Lambda@Edge functions to dynamically render and deliver file listings to a client's browser.

To use it, launch the provided CloudFormation template; it will create an S3 bucket and have your file-server interface up and running in just a few minutes.

There are many viable alternatives, as already suggested by other posters, but I believe this approach has a unique range of benefits:

  • Completely serverless and built for web-scale.
  • Open source and free to use (though, of course, you must pay AWS for resource utilization -- such as S3 storage costs).
  • Simple / static client browser content:
    • No Ajax or third party libraries to worry about.
    • No browser compatibility worries.
  • All backing systems are native AWS components.
  • You never share account credentials or rely on 3rd party services.
  • The S3 bucket remains private - allowing you to only expose parts of the bucket.
  • A custom hostname / SSL certificate can be established for your file server interface.
  • Some or all of the host files can be protected behind Basic Auth username/password.
  • An AWS WebACL can be configured to prevent abusive access to the service.

Upvotes: 0

Mickael Kerjean

Reputation: 129

Filestash is the perfect tool for that:

  1. Log in to your bucket from https://www.filestash.app/s3-browser.html.

  2. Create a shared link.

  3. Share it with the world.

Filestash is also open source. (Disclaimer: I am the author.)

Upvotes: 7

Itay

Reputation: 732

I had the same problem and fixed it using the "Make Public" context-menu entry:

  1. Go to https://console.aws.amazon.com/s3/home.
  2. Select the bucket, then right-click each folder or file (multi-select works too).
  3. Choose "Make Public".

Upvotes: 1

perpetual_check

Reputation: 1028

You can use a bucket policy to give anonymous users full read access to your objects. Depending on whether they need to LIST the bucket or just GET objects, you'll want to tweak this (listing the contents of a bucket requires the "s3:ListBucket" action).

http://docs.aws.amazon.com/AmazonS3/latest/dev/AccessPolicyLanguage_UseCases_s3_a.html

Your policy will look something like the following. You can use the S3 console at http://aws.amazon.com/console to upload it.

{
  "Version": "2008-10-17",
  "Statement": [
    {
      "Sid": "AddPerm",
      "Effect": "Allow",
      "Principal": {
        "AWS": "*"
      },
      "Action": ["s3:GetObject"],
      "Resource": ["arn:aws:s3:::bucket/*"]
    }
  ]
}
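If anonymous users should also be able to list the bucket (so that the XML directory listing works), a second statement can be added. Note that "s3:ListBucket" applies to the bucket ARN itself, not to "bucket/*". A sketch, with the bucket name as a placeholder:

```json
{
  "Sid": "AddListPerm",
  "Effect": "Allow",
  "Principal": { "AWS": "*" },
  "Action": ["s3:ListBucket"],
  "Resource": ["arn:aws:s3:::bucket"]
}
```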

If you're truly opening up your objects to the world, you'll want to look into setting up CloudWatch billing alarms so you can shut off access to your objects if they become too popular.

Upvotes: 0

Myrne Stol

Reputation: 11438

I found this related question: Directory Listing in S3 Static Website

As it turns out, if you enable public read for the whole bucket, S3 can serve directory listings. Problem is they are in XML instead of HTML, so not very user-friendly.

There are three ways you could go for generating listings:

  • Generate index.html files for each directory on your own computer, upload them to S3, and update them whenever you add new files to a directory. Very low-tech. Since you're uploading build files straight from Travis, this may not be that practical, as it would require doing extra work there.

  • Use a client-side S3 browser tool.

  • Use a server-side browser tool.

    • s3browser (PHP)
    • s3index (Scala). Going by the existence of a Procfile, it may be readily deployable to Heroku, though I'm not sure since I don't have any experience with Scala.
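The first, low-tech option can be sketched in a few lines. Assuming the standard S3 ListBucketResult XML (the format the bucket endpoint returns when listing is public), a small script could convert a listing into an index.html; the bucket name and URL below are made-up examples:

```python
import xml.etree.ElementTree as ET

# Namespace used by S3's ListBucketResult responses.
S3_NS = {"s3": "http://s3.amazonaws.com/doc/2006-03-01/"}

def listing_to_index(xml_text, base_url):
    """Turn an S3 ListBucketResult XML document into a simple HTML index."""
    root = ET.fromstring(xml_text)
    links = []
    for contents in root.findall("s3:Contents", S3_NS):
        key = contents.find("s3:Key", S3_NS).text
        links.append('<li><a href="{0}/{1}">{1}</a></li>'.format(base_url, key))
    return "<html><body><ul>\n" + "\n".join(links) + "\n</ul></body></html>"

# A minimal example listing (bucket and key names are hypothetical):
sample = """<?xml version="1.0" encoding="UTF-8"?>
<ListBucketResult xmlns="http://s3.amazonaws.com/doc/2006-03-01/">
  <Name>my-artifacts</Name>
  <Contents><Key>build-1.zip</Key></Contents>
  <Contents><Key>build-2.zip</Key></Contents>
</ListBucketResult>"""

print(listing_to_index(sample, "https://my-artifacts.s3.amazonaws.com"))
```

A build step could fetch the live listing, run something like this, and upload the result as index.html.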

Upvotes: 33
