Decoy_Octopus_

Reputation: 11

Is there a simple way to manage files stored in an S3 bucket (via macOS Finder or an FTP client)?

I need to upload files (mainly PDF documents) into folders within an AWS S3 bucket.

Example:
PDF files with the keyword "accounting" in the filename need to go to the accounting folder
PDF files with the keyword "HR" in the filename need to go to the HR folder
PDF files with the keyword "ops" in the filename need to go to the operations folder

and so on.

So far I've been using the AWS S3 web console in Firefox (or any browser, really), but it is very tedious.

Is it possible to connect to the bucket as a remote directory and manage files (move, rename, drag-and-drop upload) via macOS Finder, Windows Explorer, an FTP client, or even via the CLI (I'm familiar with bash basics)?

Thanks

So far, I'm only able to access the bucket via the web browser: double-click through directories until I reach the desired folder, upload the files that need to go there, go back up the directory tree, open a different folder, upload more files, and so on.

It works but I'm sure there is a more efficient way.

Upvotes: -1

Views: 1383

Answers (2)

lostmarien

Reputation: 71

What can I say, I decided to try Cyberduck first. The interface is bearable, but not completely intuitive. Several times, errors occurred during uploads and the transfers had to be restarted. I was a bit disappointed, so I went on to test Commander One, and it's been great: simple to use, handles uploads and downloads smoothly, and saves a lot of time.

Comparing by functionality, the two programs are almost the same; perhaps Cyberduck performs better as an FTP client, but that would need to be tested.

Upvotes: 1

John Rotenstein

Reputation: 269490

There are many apps that know how to communicate with Amazon S3.

For the Mac, some examples are Cyberduck and Commander One. They allow drag-and-drop copying to/from S3.

You can also use the AWS Command-Line Interface (CLI) to copy files from a Terminal window.

You could also script your requirement to move files to different destinations based on filename:

aws s3 cp . s3://my-bucket/accounting/ --recursive --exclude "*" --include "*accounting*"
aws s3 cp . s3://my-bucket/HR/ --recursive --exclude "*" --include "*HR*"
aws s3 cp . s3://my-bucket/operations/ --recursive --exclude "*" --include "*ops*"

These commands will find any files in the current directory that include a particular word and then copy them to S3 into the correct directory.
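If the list of keyword-to-folder mappings grows, you could also wrap the logic in a small bash script. This is just a sketch: the bucket name `my-bucket` is a placeholder, and the `dest_for` helper is a hypothetical name for illustration.

```shell
#!/usr/bin/env bash
# Decide the S3 destination folder for a file based on keywords in its name.
# "my-bucket" is a placeholder; adjust the keyword/folder mapping as needed.
dest_for() {
  case "$1" in
    *accounting*) echo "accounting" ;;
    *HR*)         echo "HR" ;;
    *ops*)        echo "operations" ;;
    *)            echo "" ;;
  esac
}

# Upload each PDF in the current directory to its matching folder.
for f in *.pdf; do
  folder=$(dest_for "$f")
  if [ -n "$folder" ]; then
    aws s3 cp "$f" "s3://my-bucket/$folder/"
  fi
done
```

Files whose names match none of the keywords are simply skipped, so nothing is uploaded to the wrong place.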

See: s3 — AWS CLI Command Reference

Upvotes: 2

Related Questions