Vivek

Reputation: 166

How to set retry for "aws s3 cp" command?

I have a Jenkins job which uploads a pretty small bash file (less than 1 MB) to an S3 bucket. It works most of the time but fails once in a while with the following error:

upload failed: build/xxxxxxx/test.sh The read operation timed out

The above error comes directly from the AWS CLI operation. I am thinking it could either be a network issue, or the disk read operation not being available at the time. How do I set an option to retry when this happens? Also, is there a timeout I can increase? I searched the CLI documentation, googled, and checked out 'aws s3api', but don't see any such option.

If such an option does not exist, how do folks get around this? Wrap the command to check the error code and reattempt?

Upvotes: 7

Views: 4982

Answers (2)

stason

Reputation: 6561

The AWS CLI docs suggest setting the env var: export AWS_MAX_ATTEMPTS=3

Full documentation is here
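For example (a sketch; note that `AWS_RETRY_MODE` only applies to AWS CLI v2, and `AWS_MAX_ATTEMPTS` counts the initial request as one attempt):

```shell
# Retry each request up to 5 times in total.
export AWS_MAX_ATTEMPTS=5
# Use the "standard" retry mode with exponential backoff (AWS CLI v2 only).
export AWS_RETRY_MODE=standard
```

Any `aws s3 cp` run afterwards in the same shell (or Jenkins step) picks these up.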

Upvotes: 2

Vivek

Reputation: 166

I ended up writing a wrapper around the s3 command to retry, and also to get a debug stack on the last attempt. Might help folks.

# Purpose: Allow retry while uploading files to s3 bucket
# Params:
#   $1 : local file to copy to s3
#   $2 : s3 bucket path
#   $3 : AWS bucket region
#
function upload_to_s3 {
    n=0
    until [ "$n" -gt 2 ]
    do
        if [ "$n" -eq 2 ]; then
            # Last attempt: run with --debug so the failure leaves a trace
            aws s3 cp --debug "$1" "$2" --region "$3"
            return $?
        else
            aws s3 cp "$1" "$2" --region "$3" && break
        fi
        n=$((n + 1))
        sleep 30
    done
}
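The same idea can be generalized to wrap any command, which makes the retry logic testable on its own (a sketch; `retry` is an illustrative helper name, not part of the AWS CLI):

```shell
# retry MAX DELAY CMD [ARGS...]: run CMD up to MAX times,
# sleeping DELAY seconds between failed attempts.
# Returns 0 on the first success, 1 if all attempts fail.
retry() {
    local max=$1 delay=$2 n=1
    shift 2
    while ! "$@"; do
        [ "$n" -ge "$max" ] && return 1
        n=$((n + 1))
        sleep "$delay"
    done
    return 0
}
```

Usage with the upload would then look like `retry 3 30 aws s3 cp "$src" "$dst" --region "$region"`.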

Upvotes: 7
