pmpjr

Reputation: 41

AWS CLI in Windows won't upload file to S3 bucket

Windows Server 2012 R2 with Python 2.7.10 and the AWS CLI tool installed. The following works:

aws s3 cp c:\a\a.txt s3://path/

I can upload that file without a problem. What I want to do is upload a file from a mapped drive to an S3 bucket, so I tried this:

aws s3 cp s:\path\file s3://path/

and it works.

What I cannot figure out is how to avoid naming a specific file and instead have it grab all files, so I can schedule this to upload the contents of a directory to my S3 bucket. I tried this:

aws s3 cp "s:\path\..\..\" s3://path/ --recursive --include "201512"

and I get the error "TOO FEW ARGUMENTS".

My best guess is that it's complaining because I'm not specifying a single file to send up, but I don't want to do that; I want to automate the whole thing.

If someone could please shed some light on what I'm missing I would really appreciate it.

Thank you

Upvotes: 4

Views: 12207

Answers (6)

Mass Dot Net

Reputation: 2259

I ran into this same problem recently, and quiver's answer -- replacing single backslashes with double backslashes -- resolved the problem I was having.

Here's the PowerShell code I used to address the problem, using the OP's original example:

# Notice how my path string contains a mixture of single- and double-backslashes
$folderPath = "c:\\a\a.txt"
echo "`$folderPath = $($folderPath)"

# Use the "Resolve-Path" cmdlet to generate a consistent path string.
$osFolderPath = (Resolve-Path $folderPath).Path
echo "`$osFolderPath = $($osFolderPath)"

# Escape backslashes in the path string.
$s3TargetPath = ($osFolderPath -replace '\\', "\\")
echo "`$s3TargetPath = $($s3TargetPath)"

# Now pass the escaped string to your AWS CLI command
# (local source first, S3 target second for an upload).
echo "AWS Command = aws s3 cp `"$s3TargetPath`" `"s3://path/`""
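The backslash-doubling step can be sketched in plain Python as well (a minimal illustration using a made-up path, not part of the original answer):

```python
# Double every backslash in a Windows path, mirroring the
# PowerShell step: ($osFolderPath -replace '\\', "\\")
path = r"c:\a\a.txt"
escaped = path.replace("\\", "\\\\")
print(escaped)  # c:\\a\\a.txt
```

Each single backslash becomes two, so nothing in the path can accidentally escape a following character.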

Upvotes: 0

Casey Cotita

Reputation: 61

In case this is useful for anyone else coming after me: add some extra spaces between the source and target. I had been beating my head against this command with every combination of single quotes, double quotes, slashes, etc.:

aws s3 cp /home/<username>/folder/ s3://<bucketID>/<username>/archive/ --recursive --exclude "*" --include "*.csv"

And it would give me: "aws: error: too few arguments" Every. Single. Way. I. Tried.

I finally saw the --debug option in aws s3 cp help, so I ran it again this way:

aws s3 cp /home/<username>/folder/ s3://<bucketID>/<username>/archive/ --recursive --exclude "*" --include "*.csv" --debug

And this was the relevant debug line:

MainThread - awscli.clidriver - DEBUG - Arguments entered to CLI: ['s3', 'cp', 'home/<username>/folder\xc2\xa0s3://<bucketID>/<username>/archive/', '--recursive', '--exclude', '*', '--include', '*.csv', '--debug']

I have no idea where the \xc2\xa0 between source and target came from, but there it is! I updated the line to add a couple of extra spaces and now it runs without errors:

aws s3 cp /home/<username>/folder/   s3://<bucketID>/<username>/archive/ --recursive --exclude "*" --include "*.csv" 
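For anyone wondering what those bytes are: \xc2\xa0 is the UTF-8 encoding of U+00A0, a non-breaking space, which most likely crept in via copy-paste from a web page. A quick Python check (a hypothetical snippet, not from the original answer) shows why it breaks argument splitting:

```python
# \xc2\xa0 is the UTF-8 encoding of U+00A0 (non-breaking space).
raw = b"folder\xc2\xa0s3://bucket/"
decoded = raw.decode("utf-8")
print(repr(decoded))  # 'folder\xa0s3://bucket/'

# A non-breaking space looks like a space but is not an ASCII space,
# so the shell does not treat it as an argument separator.
print(decoded.split(" "))  # ['folder\xa0s3://bucket/'] - still one token
```

That is exactly why the debug log showed source and target fused into a single argument.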

Upvotes: 5

yaaagy

Reputation: 1

I faced the same situation. Let me share two scenarios I tried with the same command.

  1. Within bash, make sure you have an AWS profile in place (run $aws configure). Also, make sure you use a proper proxy if applicable.

$aws s3 cp s3://bucket/directory/ /usr/home/folder/ --recursive --region us-east-1 --profile yaaagy

It worked.

  2. Within a Perl script:

    $cmd = "$aws s3 cp s3://bucket/directory/ /usr/home/folder/ --recursive --region us-east-1 --profile yaaagy";

I enclosed the whole command in double quotes and it was successful. Let me know if this works out for you.

Upvotes: 0

Harshavardhana

Reputation: 1428

Alternatively you can try 'mc', which ships as a single binary and is available for Windows in both 64-bit and 32-bit builds. 'mc' implements mirror, cp, resumable sessions, JSON-parseable output and more - https://github.com/minio/mc

Upvotes: 3

quiver

Reputation: 4470

aws s3 cp "s:\path\..\..\" s3://path/ --recursive --include "201512" TOO FEW ARGUMENTS

This is because, in your command, the trailing backslash (\) escapes the closing double quote ("), so the local path (s:\path\..\..\) is not parsed correctly.

What you need to do is escape each backslash with a double backslash, i.e.:

aws s3 cp "s:\\path\\..\\..\\" s3://path/ --recursive --include "201512"

Upvotes: 4

Mark B

Reputation: 200446

Use aws s3 sync instead of aws s3 cp to copy the contents of a directory, e.g. aws s3 sync s:\path\ s3://path/ - it takes the same source/target arguments and copies everything under the source.

Upvotes: 0
