ImPurshu

Reputation: 430

Copy folder with sub-folders and files from server to S3 using AWS CLI

I have a folder with a bunch of subfolders and files which I am fetching from a server and assigning to a variable. I want to upload this data to an S3 bucket with the same folder structure using boto3. Folder structure:

└── main_folder
    └── folder
        ├── folder
        │   ├── folder
        │   │   └── a.json
        │   ├── folder
        │   │   ├── folder
        │   │   │   └── b.json
        │   │   ├── folder
        │   │   │   └── c.json
        │   │   └── folder
        │   │       └── d.json
        │   └── folder
        │       └── e.json
        ├── folder
        │   └── f.json
        └── folder
            └── i.json
Code:

import subprocess
import logging

logger = logging.getLogger(__name__)

try:
    for dirname in utils['BucketData']:
        # Run "aws s3 sync" for each directory entry
        cmd = ['aws', 's3', 'sync', dirname, 's3://{}/{}/'.format(input_bucket_name, folder_name)]
        sp = subprocess.run(cmd)
except Exception as e:
    logger.debug('Command error: {}'.format(e))

Variable details:

utils['BucketData'] = the data (shown in the folder structure above) fetched from the server; it is in dict form
dirname = each directory being copied to the S3 location
input_bucket_name = the name of the bucket where my folder is located
folder_name = where I want to copy the data

When I run it, I get:

The user-provided path 'directory name' does not exist.

How can I copy the data assigned to the variable to S3 with the same folder structure? What is wrong here?

Upvotes: 1

Views: 3895

Answers (1)

Walter A

Reputation: 20012

(First given as a comment; I was not sure it was a solution to the problem.)

You can use

aws s3 sync localdir "s3://yourbucket/dirname"

Perhaps you need to add --region=yourplace as well.
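
The error in the question means the source path handed to aws s3 sync is not a directory that exists on the local filesystem: sync needs a real local path as its source, and iterating utils['BucketData'] yields the dict's keys, which are apparently not paths on disk. If you would rather stay in boto3, as the question mentions, a minimal sketch along these lines walks a local directory and uploads each file, keeping relative paths as S3 keys so the folder structure is preserved (local_dir, bucket, and prefix are placeholder names, not part of the question's code):

import os
import boto3

s3 = boto3.client('s3')

def upload_dir(local_dir, bucket, prefix):
    # Walk the local tree and upload each file, using its path
    # relative to local_dir as the S3 key so the structure is kept.
    for root, _, files in os.walk(local_dir):
        for name in files:
            local_path = os.path.join(root, name)
            rel_path = os.path.relpath(local_path, local_dir)
            key = '{}/{}'.format(prefix, rel_path.replace(os.sep, '/'))
            s3.upload_file(local_path, bucket, key)

upload_dir('main_folder', 'yourbucket', 'dirname')

This uploads files one at a time, so for large trees aws s3 sync with its parallel transfers is usually faster, but it removes the subprocess dependency.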

Upvotes: 2
