Reputation: 2812
Hi, I need to transfer a file to an EC2 machine via the SSM agent. I have successfully installed the SSM agent on my EC2 instances, and from the UI I am able to start a session via Session Manager and log in to the shell of that EC2 machine.
Now I tried to automate it via boto3, using the code below:
import boto3

ssm_client = boto3.client('ssm', 'us-west-2')
resp = ssm_client.send_command(
    DocumentName="AWS-RunShellScript",  # One of AWS' preconfigured documents
    Parameters={'commands': ['echo "hello world" >> /tmp/test.txt']},
    InstanceIds=['i-xxxxx'],
)
The above works fine and I am able to create a file called test.txt on the remote machine, but this is via the echo command. Instead, I need to send a file from my local machine to this remote EC2 machine via the SSM agent, so I did the following.
I modified /etc/ssh/ssh_config with a ProxyCommand as below:
# SSH over Session Manager
host i-* mi-*
ProxyCommand sh -c "aws ssm start-session --target %h --document-name AWS-StartSSHSession --parameters 'portNumber=%p'"
Then, in the code above, I tried to start a session with the line below, and that was also successful:
response = ssm_client.start_session(Target='i-04843lr540028e96a')
Now I am not sure how to use this session response, or how to use this AWS SSM session to send a file.
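One way to consume the `start_session` response is to hand it to the `session-manager-plugin` binary, which is what the AWS CLI itself does under the hood. A minimal sketch, assuming the plugin is on your PATH; the positional-argument order below mirrors how the CLI invokes the plugin and is an assumption, not a documented stable interface:

```python
import json

def plugin_argv(response, target, region, profile="default"):
    """Build the argv the AWS CLI passes to session-manager-plugin.

    ASSUMPTION: the positional-argument order mirrors the CLI's internal
    invocation of the plugin; it is not a documented, stable interface.
    """
    return [
        "session-manager-plugin",
        json.dumps(response),            # the raw start_session response
        region,
        "StartSession",
        profile,
        json.dumps({"Target": target}),  # the original request parameters
        f"https://ssm.{region}.amazonaws.com",
    ]

# import subprocess
# response = ssm_client.start_session(Target='i-04843lr540028e96a')
# subprocess.run(plugin_argv(response, 'i-04843lr540028e96a', 'us-west-2'), check=True)
```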
Environment description:
Source: a pod running in an EKS cluster
Destination: an EC2 machine (which has the SSM agent running)
File to be transferred: an important private key, which will be used by a process on the EC2 machine and will be different for each machine
Solution tried:
Basically, I wanted to achieve the scp approach mentioned in this AWS document: https://docs.aws.amazon.com/systems-manager/latest/userguide/session-manager-working-with-sessions-start.html#sessions-start-ssh
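In the meantime, a small file can be pushed with the same `send_command` API used above, by base64-encoding it locally and reassembling it on the instance. A sketch, where `build_push_commands` is a hypothetical helper; command-size limits on Run Command documents apply, so this only suits small files such as a key:

```python
import base64

def build_push_commands(data: bytes, remote_path: str, chunk_size: int = 4000):
    """Build shell commands that recreate `data` at `remote_path`.

    The file travels as base64 chunks inside the command text itself,
    so this only suits small files (document/command size limits apply).
    """
    b64 = base64.b64encode(data).decode("ascii")
    commands = [f": > {remote_path}.b64"]  # truncate the staging file
    for i in range(0, len(b64), chunk_size):
        commands.append(f"echo '{b64[i:i + chunk_size]}' >> {remote_path}.b64")
    commands.append(
        f"base64 -d {remote_path}.b64 > {remote_path} && rm {remote_path}.b64"
    )
    return commands

# Then send them with the client from above, e.g.:
# ssm_client.send_command(
#     InstanceIds=['i-xxxxx'],
#     DocumentName="AWS-RunShellScript",
#     Parameters={'commands': build_push_commands(open('key.pem', 'rb').read(),
#                                                 '/etc/app/key.pem')},
# )
```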
Upvotes: 8
Views: 57931
Reputation: 830
This is not the boto3 approach the OP requested, but here are the vanilla CLI steps:
1. Log in to the machine as ubuntu/ec2-user, edit ~/.ssh/authorized_keys, and add a line for your personal SSH key. Make sure the permissions are 0600.
2. Edit ~/.ssh/config and add something like:
host i-* mi-*
ProxyCommand sh -c "aws ssm start-session --target %h --document-name AWS-StartSSHSession --parameters 'portNumber=%p'"
3. Export the required environment variables, e.g. export AWS_PROFILE=your-profile and export AWS_REGION=eu-central-1.
4. Verify you can log in via SSH by running ssh i-0123456789 -l ubuntu.
5. tar.gz the content up to make it faster, as the transfer is slow.
6. SCP the archive off the machine: scp ubuntu@i-0123456789:/home/ubuntu/content.tar.gz ~/Downloads/
Upvotes: 0
Reputation: 41
If you already have SSM set up, why do you need to use boto3 and send a file at all?
aws ssm start-session --target i-xxxx \
--document-name AmazonEKS-ExecuteNonInteractiveCommand \
--parameters 'command="cat remotefile"' | tail -n +3 | head -n -3 > file
Where i-xxxx is your instance ID, remotefile is the name of the remote file, and file is the name it will be given when you receive it. Obviously ssm-user will need to be able to read remotefile. If not, you could probably add some sudo magic in there too.
No need to setup ssh keys or S3 buckets or ...whatever.....
The tail -n +3 | head -n -3 is needed because I can't see a way to persuade SSM not to print a blank line and "Starting session..." or "Exiting session..." on each connection.
That only works up to about 250 KB file size for me. I don't know whether it was corrupting binary content, because my file was bigger than that.
Alternative:
aws ssm start-session --target i-xxxx \
--document-name AWSFleetManager-GetFileContent \
--parameters 'Path=remotefile,PrintInteractiveStatements=No' | tail -n +4 | head -n -3 > file
This only works for text files, I think. It nuked my tar file; it might just be doing a unix2dos conversion on the way through. Running the file through base64 first made it happy.
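The same base64 trick can be driven from boto3, which also sidesteps the banner-stripping problem. A sketch, where `fetch_file` is a hypothetical helper taking a boto3 SSM client; note that Run Command truncates StandardOutputContent at roughly 24,000 characters, so this only suits small files:

```python
import base64
import time

def fetch_file(ssm, instance_id, remote_path, poll_seconds=1.0):
    """Read a small remote file by base64-ing it through Run Command output.

    `ssm` is a boto3 SSM client, e.g. boto3.client('ssm', 'us-west-2').
    """
    resp = ssm.send_command(
        InstanceIds=[instance_id],
        DocumentName="AWS-RunShellScript",
        Parameters={"commands": [f"base64 {remote_path}"]},
    )
    cmd_id = resp["Command"]["CommandId"]
    # Poll until the command leaves its in-flight states.
    while True:
        inv = ssm.get_command_invocation(CommandId=cmd_id, InstanceId=instance_id)
        if inv["Status"] not in ("Pending", "InProgress", "Delayed"):
            break
        time.sleep(poll_seconds)
    if inv["Status"] != "Success":
        raise RuntimeError(f"command ended with status {inv['Status']}")
    return base64.b64decode(inv["StandardOutputContent"])
```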
Upvotes: 4
Reputation: 81
The answer given by @Nathan Williams is confusing (scp file.txt ec2-user@i-04843lr540028e96a). When using scp you use SSH as the copy protocol, so you have to set up a username/password or SSH keys to copy a file.
This scp command doesn't work unless you share keys and specify a region and a profile. The complete command would be something like:
scp -i keyfile file.txt ec2-user@i-04843lr540028e96a --region xxx --profile myprofile
If you configured your default profile and region you don't have to put them in the command.
I think what most people are looking for is just an easy way to transfer a file to an EC2 instance using only SSM, something like ssm cp file instance-name, which as far as I could research doesn't exist.
@Matteo's statement is correct: why do I need SSH keys if the whole point of SSM is to get rid of them? Basically, you use SSM as a kind of proxy so you can reach your EC2 machines without having to specify the actual IP address (maybe the instance doesn't have a public IP, or it does and in that case you have to whitelist your source IP in a security group). So you reach port 22 of the EC2 instance over SSM by just specifying the instance ID, but you still authenticate over SSH (key and user).
SCP works over SSH, so you still need a key to use it. Again, I think most people were expecting just a plain ssm cp file instance.
Upvotes: 8
Reputation: 1568
You can use an S3 bucket as a proxy. The only thing required is to give the EC2 instance permission to access S3. This way you don't have to use the SSH protocol to copy files between machines.
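That flow can be sketched with boto3 as below; `push_via_s3`, the bucket name, and the paths are placeholders, and it assumes the instance profile allows s3:GetObject on the bucket and that the AWS CLI is installed on the instance:

```python
def push_via_s3(s3, ssm, local_path, bucket, key, instance_id, dest_path):
    """Upload a local file to S3, then have the instance pull it down.

    `s3` and `ssm` are boto3 clients; the bucket and paths are placeholders.
    Assumes the instance profile allows s3:GetObject on the bucket and that
    the AWS CLI is installed on the instance.
    """
    s3.upload_file(local_path, bucket, key)
    return ssm.send_command(
        InstanceIds=[instance_id],
        DocumentName="AWS-RunShellScript",
        Parameters={"commands": [f"aws s3 cp s3://{bucket}/{key} {dest_path}"]},
    )
```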
Upvotes: 2
Reputation: 792
If you have SSH over SSM setup, you can just use normal scp, like so:
scp file.txt ec2-user@i-04843lr540028e96a
If it isn't working, make sure you have the SSH-over-SSM prerequisites in place: the ProxyCommand entry in ~/.ssh/config and your public key in the instance's ~/.ssh/authorized_keys. If you need more detail, give me more info on your setup, and I'll try and help.
Upvotes: 5