Magnus Melwin

Reputation: 1517

AWS S3 cp creates undefined files

While using the AWS CLI cp command to copy files recursively, there appears to be a bug that creates some "undefined" files.

aws s3 cp --recursive $HOME/$MYHOST-$MYTIMESTAMP/$MYHOST-$MYTIMESTAMP-*.xml  s3://mybucket/$MYHOST-$MYTIMESTAMP/

The command works fine and uploads to the specified bucket. But it also creates some "undefined" files outside the destination folder, in the root of the bucket. This happens every time, and I have to rm (delete) those annoying "undefined" files.

I presumed it to be a bug and tried uploading the files individually rather than using wildcards, with the same result as the recursive copy: it still creates additional "undefined" files in the bucket root. This happens only when I run a bunch of these cp commands from a bash script, and in that case the problem shows up intermittently.

aws s3 cp  $HOME/$MYHOST-$MYTIMESTAMP/$MYHOST-$MYTIMESTAMP-hello.xml  s3://mybucket/$MYHOST-$MYTIMESTAMP/

However, when copying only a single file, it doesn't show up. My CLI version:

aws-cli/1.14.34 Python/2.7.14+ Linux/4.4.104-39-default botocore/1.8.38

Any help would be highly appreciated on this.
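For reference, a cleanup along these lines is what I run each time (assuming all the stray object names start with "undefined"):

aws s3 rm s3://mybucket/ --recursive --exclude "*" --include "undefined*"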

Upvotes: 1

Views: 1089

Answers (1)

Michael - sqlbot

Reputation: 179404

You have configured S3 access logging to write logs into this bucket. Presumably, these are the log files for this bucket.

Why the filenames begin with "undefined" is not clear -- something may have gone wrong when you set up logging for the bucket so that the log file prefix did not get saved -- but the filenames look like the names of the log files that S3 creates.

https://docs.aws.amazon.com/AmazonS3/latest/dev/ServerLogs.html
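You can confirm this by inspecting the bucket's logging configuration with the s3api CLI (bucket name taken from your example):

aws s3api get-bucket-logging --bucket mybucket

If logging is enabled, the output contains a LoggingEnabled block with the TargetBucket and TargetPrefix; an empty or missing TargetPrefix would be consistent with the "undefined" prefix on the file names you are seeing.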

Best practice is to set up a separate bucket for collecting S3 access logs in each region.
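A fix along these lines redirects the logs to a dedicated bucket with an explicit prefix (the bucket names here are placeholders, and the target bucket must already exist and grant write permission to the S3 log delivery group):

aws s3api put-bucket-logging --bucket mybucket --bucket-logging-status '{"LoggingEnabled": {"TargetBucket": "my-access-logs-bucket", "TargetPrefix": "mybucket/"}}'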

Upvotes: 3
