Saita

Reputation: 1044

Jenkins pipeline - use ssh agent to clone a repository in another machine through ssh

Use case: I have a Jenkins pipeline to update my development environment. My dev env is an AWS EC2 instance running Docker Compose.

The automation was written along the lines of:

withAWS(profile: 'default') {
   sh "ssh -o StrictHostKeyChecking=no -i ~/my-key.pem user@123.456.789 /bin/bash -c 'run some command like docker pull'"
}

Now I have other test environments as well, each with its own docker-compose file, configuration and property files, which forces me to go over all of them whenever something needs to change.

To help with that, I created a new repository to hold all the different environment configurations. My plan is to keep a clone of this repo in every development and test environment, so when I need to change something I can do it locally, push it, and have my Jenkins pipeline update the clone in whichever environment it is updating.

My Jenkins already has an SSH credential for my repo (it is used in another job that clones the repo and runs tests on the source code), so I want to reuse that same credential.

Question: can I somehow, while ssh'ing into another machine, use the Jenkins ssh-agent credentials to clone/update a Bitbucket repository?

Edit: I changed the pipeline to:

script {
   def hgCommand = "hg clone ssh://[email protected]/my-repo"
   sshagent(['12345']) {
     sh "ssh -o StrictHostKeyChecking=no -i ~/mykey.pem admin@${IP_ADDRESS} /bin/bash -c '\"${hgCommand}\"'"
   }
}

And I am getting:

[ssh-agent] Looking for ssh-agent implementation...
[ssh-agent]   Exec ssh-agent (binary ssh-agent on a remote machine)
$ ssh-agent
SSH_AUTH_SOCK=/tmp/ssh-FOburguZZlU0/agent.662
SSH_AGENT_PID=664
Running ssh-add (command line suppressed)
Identity added: /home/jenkins/workspace/abc@tmp/private_key_12345.key (rsa w/o comment)
[ssh-agent] Started.
[Pipeline] {
[Pipeline] sh
[test-env-config] Running shell script
+ ssh -o StrictHostKeyChecking=no -i /home/jenkins/mykey.pem [email protected] /bin/bash -c "hg clone ssh://[email protected]/my-repo"
remote: Warning: Permanently added the RSA host key for IP address '765.432.123' to the list of known hosts.
remote: Permission denied (publickey).
abort: no suitable response from remote hg!
$ ssh-agent -k
unset SSH_AUTH_SOCK;
unset SSH_AGENT_PID;
echo Agent pid 664 killed;
[ssh-agent] Stopped.

Upvotes: 1

Views: 3596

Answers (1)

marco.m

Reputation: 4849

First, some background to understand the reasoning (this is pure ssh, nothing Jenkins- or Mercurial-specific): the ssh-agent utility works by creating a UNIX domain socket that ssh then uses. The ssh command attempts to communicate with the agent if it finds the environment variable SSH_AUTH_SOCK set. In addition, ssh can be instructed to forward the agent connection via -A. For more details, see the man pages of ssh-agent and ssh.
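You can see the mechanism in a plain shell, outside of Jenkins (the `ssh -A user@host` line is shown as a comment because it needs a real remote host):

```shell
# Start an agent; it prints shell code exporting SSH_AUTH_SOCK and SSH_AGENT_PID,
# which eval makes effective in the current shell.
eval "$(ssh-agent -s)" > /dev/null
echo "agent socket: $SSH_AUTH_SOCK"
# Any ssh started from this shell inherits SSH_AUTH_SOCK and talks to the agent.
# With -A the same socket is forwarded to the remote side, e.g.:
#   ssh -A user@host 'ssh-add -l'   # the remote ssh-add sees the local identities
ssh-agent -k > /dev/null   # stop the agent again
```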

So, assuming that your withAWS context makes the environment variable SSH_AUTH_SOCK (set by the plugin) available, I think it should be enough to:

  • add -A to your ssh invocation
  • in the 'run some command like docker pull' part, add the hg clone command, making sure you use the ssh:// scheme in the Mercurial URL.
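Putting the two points together, a sketch of the pipeline step could look like this (the credential ID '12345', the key path, user and ${IP_ADDRESS} are the placeholders from your edit; I have not tested this exact combination):

```groovy
script {
    // ssh:// scheme so Mercurial authenticates over ssh, and hence via the agent
    def hgCommand = "hg clone ssh://[email protected]/my-repo"
    sshagent(['12345']) {
        // -A forwards the Jenkins agent socket to the remote machine, so the
        // hg clone running there can authenticate with the Jenkins credential
        sh "ssh -A -o StrictHostKeyChecking=no -i ~/mykey.pem admin@${IP_ADDRESS} /bin/bash -c '\"${hgCommand}\"'"
    }
}
```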

Security observation: -o StrictHostKeyChecking=no should only be used as a last resort. Since, from your example, the IP address of the target is fixed, you should do the following instead:

  • remove the -o StrictHostKeyChecking=no
  • one-shot: get the host key of 123.456.789 (for example by ssh-ing into it and then looking for the associated line in your $HOME/.ssh/known_hosts). Save that line in a file, say 123.456.789.fingerprint
  • make the file 123.456.789.fingerprint available to Jenkins when it is invoking your sample code. This can be done by committing that file to the repo that contains the Jenkins pipeline; that is safe to do since the file doesn't contain any secrets.
  • Finally, change your ssh invocation to something like ssh -o UserKnownHostsFile=/path/to/123.456.789.fingerprint ...
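As a sketch, the one-shot capture and the pinned invocation could look like this (ssh-keyscan is an alternative to copying the line out of known_hosts; 123.456.789 is the placeholder address from the question, and the final command is echoed rather than executed since it needs the real host):

```shell
# One-shot, run from a network you trust (123.456.789 is a hypothetical placeholder):
#   ssh-keyscan 123.456.789 > 123.456.789.fingerprint
# Commit 123.456.789.fingerprint to the repo holding the Jenkinsfile, then pin ssh to it:
SSH_CMD="ssh -o UserKnownHostsFile=/path/to/123.456.789.fingerprint user@123.456.789 'docker pull my-image'"
echo "$SSH_CMD"
```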

Upvotes: 2
