Reputation: 2912
I followed this link to try to SSH to my server in GitLab CI. For the SSH keys, I logged into the server and generated the public and private keys. The private key was copied into the GitLab CI/CD environment variables.
The YAML template is below, copied mostly from the link.
image: docker:19.03.8

services:
  - docker:19.03.8-dind

deployment:
  variables:
    ip: <ip-address>
  script:
    - apk add --update openssh-client sshpass
    - eval $(ssh-agent -s)
    - echo "$SSH_PRIVATE_KEY" | ssh-add - > /dev/null
    - mkdir -p ~/.ssh
    - chmod 700 ~/.ssh
    - export SSHPASS=$AWS_PASSWORD
    - sshpass -e ssh -o StrictHostKeyChecking=no -vvv ubuntu@$ip echo testing
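As an aside, a common alternative to piping the key into ssh-agent is to write it to a file with strict permissions, since ssh silently skips private keys whose modes are too loose. This is only a hedged sketch with placeholder paths and key material, not the poster's setup:

```shell
# Write the private key to a file instead of loading it into ssh-agent.
KEYDIR="$(mktemp -d)"                       # stands in for ~/.ssh in the CI job
SSH_PRIVATE_KEY="dummy-key-material"        # in CI this comes from a masked variable
printf '%s\n' "$SSH_PRIVATE_KEY" > "$KEYDIR/id_rsa"
chmod 600 "$KEYDIR/id_rsa"                  # required: ssh rejects looser modes
# The job would then connect with an explicit identity file:
#   ssh -i "$KEYDIR/id_rsa" ubuntu@$ip echo testing
echo "wrote $KEYDIR/id_rsa"
```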
However, I encountered an error when trying to use the private key.
debug1: Authentications that can continue: publickey,password
debug1: Trying private key: /root/.ssh/id_rsa
debug3: no such identity: /root/.ssh/id_rsa: No such file or directory
debug1: Trying private key: /root/.ssh/id_dsa
debug3: no such identity: /root/.ssh/id_dsa: No such file or directory
debug1: Trying private key: /root/.ssh/id_ecdsa
debug3: no such identity: /root/.ssh/id_ecdsa: No such file or directory
debug1: Trying private key: /root/.ssh/id_ed25519
debug3: no such identity: /root/.ssh/id_ed25519: No such file or directory
debug1: Trying private key: /root/.ssh/id_xmss
debug3: no such identity: /root/.ssh/id_xmss: No such file or directory
debug2: we did not send a packet, disable method
debug3: authmethod_lookup password
debug3: remaining preferred: ,password
debug3: authmethod_is_enabled password
debug1: Next authentication method: password
debug3: send packet: type 50
debug2: we sent a password packet, wait for reply
debug3: receive packet: type 51
debug1: Authentications that can continue: publickey,password
Permission denied, please try again.
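The trace above shows ssh falling through the default identity files without ever offering an agent key, which suggests the key never made it into the agent. A quick way to check what the agent actually holds (a sketch, run locally or as an extra job step):

```shell
eval "$(ssh-agent -s)" > /dev/null    # start a throwaway agent
IDS_STATUS=0
ssh-add -l || IDS_STATUS=$?           # exit status 1: agent reachable but holds no keys
echo "ssh-add exit status: $IDS_STATUS"
eval "$(ssh-agent -k)" > /dev/null    # stop the throwaway agent again
```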
I am using gitlab shared runners, if that helps.
[Update]
Forgot to add that on the server I want to connect to, I added the public key I generated (id_rsa.pub) to the authorized_keys file.
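For reference, a key pair like this can be generated and installed as follows. This is a sketch; the file names and the rsa/2048 choice are examples, not necessarily what was used here:

```shell
WORK="$(mktemp -d)"
# Generate a key pair with no passphrase; creates id_rsa and id_rsa.pub
ssh-keygen -t rsa -b 2048 -N "" -f "$WORK/id_rsa" -q
head -c 7 "$WORK/id_rsa.pub"; echo    # public key files start with the key type
# On the server, the public key is appended to the authorized_keys file:
#   cat id_rsa.pub >> ~/.ssh/authorized_keys
```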
[Edit 1]
As suggested, I added the known hosts: I ran ssh-keyscan and copied its output into a variable $SSH_KNOWN_HOSTS. The updated YAML file is below. However, I encountered the same error.
deployment:
  variables:
    ip: <ip-address>
  script:
    - apk add --update openssh-client sshpass
    - eval $(ssh-agent -s)
    - echo "$SSH_PRIVATE_KEY" | ssh-add - > /dev/null
    - mkdir -p ~/.ssh
    - chmod 700 ~/.ssh
    - touch ~/.ssh/known_hosts
    - echo "$SSH_KNOWN_HOSTS" >> ~/.ssh/known_hosts
    - chmod 644 ~/.ssh/known_hosts
    - export SSHPASS=$AWS_PASSWORD
    - sshpass -e ssh -o StrictHostKeyChecking=no -vvv ubuntu@$ip echo testing
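For what it's worth, the known_hosts steps above can be exercised in isolation. The sketch below uses a placeholder entry instead of real ssh-keyscan output (which would need network access to the target host), and a temp directory in place of ~/.ssh:

```shell
# Placeholder known_hosts entry; real ones come from `ssh-keyscan <ip-address>`
SSH_KNOWN_HOSTS="203.0.113.10 ssh-ed25519 AAAAC3NzaC1lZDI1NTE5AAAAplaceholder"
HOSTS="$(mktemp -d)/known_hosts"              # stands in for ~/.ssh/known_hosts
printf '%s\n' "$SSH_KNOWN_HOSTS" >> "$HOSTS"
chmod 644 "$HOSTS"
wc -l < "$HOSTS"                              # one line per scanned host/key type
```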
Upvotes: 3
Views: 4761
Reputation: 6629
I'm not sure about sshpass, since I usually use public/private keys. Here's an example of a job I would set up to run SCP/SSH commands on remote servers:
deploy:
  stage: deploy
  variables:
    hostname: app-dev
  before_script:
    # optional step if you decide to use a hostname instead of IP address
    - cp -f ./network/etc/hosts /etc/hosts
    # Setup SSH
    - which ssh-agent || ( apt-get update -y && apt-get install openssh-client -y )
    - eval $(ssh-agent -s)
    - ssh-add <(cat $SSH_PRIVATE_KEY)
    - mkdir -p ~/.ssh
    - chmod 700 ~/.ssh
    - ssh-keyscan $HOSTNAME >> ~/.ssh/known_hosts
    - chmod 644 ~/.ssh/known_hosts
  script:
    # Copy files and execute commands
    - scp ./scripts/install_package.sh root@$HOSTNAME:/tmp/deploy
    - ssh root@$HOSTNAME "/tmp/deploy/install_package.sh && exit"
Before running the pipeline, you need to do the following:
- Generate a key pair with ssh-keygen. Don't use a passphrase. The public key ends in .pub; the private key has no extension.
- Add the public key to ~/.ssh/authorized_keys on the remote server.
- Add the private key to your pipeline as a variable named SSH_PRIVATE_KEY.
- If you want to use a $HOSTNAME environment variable, define the variable in your pipeline and add the IP/hostname to the /etc/hosts file in your pipeline container. Otherwise, just use an IP address instead.
Upvotes: 5