WestCoastProjects

Reputation: 63022

Use the aws client to copy s3 files from a single directory only (non-recursively)

Consider an aws bucket/key structure along these lines

 myBucket/dir1/file1
 myBucket/dir1/file2
 myBucket/dir1/dir2/dir2file1
 myBucket/dir1/dir2/dir2file2

When using:

 aws s3 cp  --recursive s3://myBucket/dir1/ .

Then we will copy down dir2file[1,2] along with file[1,2]. How can we copy only the latter files and not the files under subdirectories?

Responding to a comment: I am not interested in adding an --exclude for every subdirectory, so this is not a duplicate of excluding directories from aws cp.

Upvotes: 2

Views: 1105

Answers (3)

fiveobjects

Reputation: 4249

There is no way to control the recursion depth when copying files with aws s3 cp, nor is it supported by aws s3 ls.

So, if you do not wish to use --exclude or --include options, I suggest you:

  1. Use the aws s3 ls command without the --recursive option to list the files directly under a directory, extract only the file names from the output, and save the names to a file. Refer to this post
  2. Then write a simple script that reads the file names and runs aws s3 cp for each one
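The two steps above can be combined into a single pipeline. This is a sketch, not a definitive implementation: it assumes the bucket/prefix from the question and relies on the aws s3 ls output format, where subdirectories appear as "PRE <name>/" lines and files end with the object name (it will misbehave on file names containing spaces).

```shell
# List objects directly under dir1/ (no --recursive), drop the "PRE" lines
# that represent subdirectories, and copy each remaining file individually.
aws s3 ls s3://myBucket/dir1/ | awk '$1 != "PRE" {print $NF}' | while read -r key; do
  aws s3 cp "s3://myBucket/dir1/${key}" .
done
```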

Alternatively, you may use:

aws s3 cp s3://myBucket/dir1/ .  --recursive --exclude "*/*"

Upvotes: 0

Abhishek Garg

Reputation: 2288

As far as I understand, you want the files in the current directory to be copied, but nothing in the child directories. You can use something like this:

aws s3 cp s3://myBucket/dir1/ . --recursive --exclude "*/*"

Here we exclude any object whose key contains a path separator after "dir1", i.e. everything inside subdirectories.

Upvotes: 1

matsev

Reputation: 33749

You can exclude paths using the --exclude option, e.g.

aws s3 cp s3://myBucket/dir1/ . --recursive --exclude "dir1/dir2/*"

More options and examples can be found in the aws cli help:

aws s3 cp help

Upvotes: 1
