Reputation: 23492
I'm creating a CI pipeline and development environment with Vagrant, Ansible and Docker. My goal is to have everything automated with a single command, no manual configuration involved. With a single ansible-playbook command I should have a fully functional continuous-deployment pipeline, with all services dockerized.
Now here's the problem. When I run the official Jenkins docker container and try to configure authentication for git, I get the following error:
host key verification failed
I understand I could log in to the Jenkins container, ssh to the git server manually and accept the host key as trusted when connecting for the first time. But this is an absolute no-no; the connectivity should be handled automatically too.
How do I configure Jenkins docker container to trust the git server at creation, when the available tools are docker, ansible and vagrant?
Upvotes: 5
Views: 3383
Reputation: 127
I'm building something similar, with my pipeline stages encapsulated in containers orchestrated by Kubernetes, and from a container based on the hashicorp/terraform:light image I can seamlessly source modules over git+ssh from my private Bitbucket server using the ssh-agent Jenkins plugin. I faced the same issue as yours from the ansible/ansible-runner image when I tried to download my roles with ansible-galaxy from the same Bitbucket server.
I tried to do the same as with Terraform and ssh-agent.
My relevant pipeline snippet looks like this:
container('ansible') {
    ...
    sshagent([ssh_key]) {
        ...
        stage('get ansible roles') {
            sh 'ansible-galaxy install -r requirements.yaml -p roles/'
            ...
        }
    }
}
It failed, and ansible-galaxy actually hides the problem pretty well:
+ ansible-galaxy install -r requirements.yaml -p roles/
[WARNING]: - ans_rol_test was NOT installed successfully: - command
/usr/bin/git clone ssh://[email protected]/project/ans_rol_test.git
ans_rol_test failed in directory /root/.ansible/tmp/ansible-local-
106DvbAa0/tmp09xwe_ (rc=128)
ERROR! - you can use --ignore-errors to skip failed roles and finish processing the list.
After I saw this is just a plain git clone, I tried to clone a repository from the pipeline:
+ /usr/bin/git clone ssh://[email protected]/project/ans_rol_test.git
Cloning into 'ans_rol_test'...
Host key verification failed.
fatal: Could not read from remote repository.
Please make sure you have the correct access rights
and the repository exists.
Then I tried to just ssh into the Bitbucket server:
+ ssh [email protected]
Pseudo-terminal will not be allocated because stdin is not a terminal.
Host key verification failed.
I realized that when I ssh with -oStrictHostKeyChecking=no, the host key is saved anyway, but the ssh client exits with 255 because sshd rejects the shell request, which fails the pipeline, so I've put a || true at the end.
Pseudo-terminal will not be allocated because stdin is not a terminal.
Warning: Permanently added 'mybitbucketserver.org,10.5.132.51' (RSA) to the list of known hosts.
shell request failed on channel 0
+ true
After this the host key is 'verified', so git clone ssh:// works and therefore ansible-galaxy does too.
...
stage('get ansible roles') {
    sh 'ssh -oStrictHostKeyChecking=no [email protected] || true'
    sh 'ansible-galaxy install -r requirements.yaml -p roles/'
    ...
}
...
output:
+ ssh -oStrictHostKeyChecking=no [email protected]
Pseudo-terminal will not be allocated because stdin is not a terminal.
Warning: Permanently added 'mybitbucketserver.org,10.5.132.51' (RSA) to the list of known hosts.
shell request failed on channel 0
+ true
[Pipeline] sh
+ /usr/bin/git clone ssh://[email protected]/project/ans_rol_test.git
Cloning into 'ans_rol_test'...
[Pipeline] sh
+ ansible-galaxy install -r requirements.yaml -p roles/
- extracting ans_rol_test to /home/jenkins/agent/workspace/configuration/roles/ans_rol_test
- ans_rol_test (1.0.0) was installed successfully
Worth noting that setting the GIT_SSH_COMMAND environment variable to "ssh -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no" does not work.
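A less hacky variant of the same idea is to pre-seed known_hosts with the server's public key using ssh-keyscan (a standard OpenSSH tool) instead of disabling strict checking. This is a sketch; the host name is a placeholder for your own git server:

```shell
# Fetch the git server's public host key non-interactively and append it to
# known_hosts, so later ssh/git connections pass strict host key checking.
# GIT_HOST is a placeholder; point it at your Bitbucket server.
GIT_HOST="${GIT_HOST:-mybitbucketserver.org}"
mkdir -p "$HOME/.ssh"
ssh-keyscan -t rsa "$GIT_HOST" >> "$HOME/.ssh/known_hosts"
```

Unlike -oStrictHostKeyChecking=no, this keeps host key verification enabled for every other host.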
Upvotes: 5
Reputation: 54447
You can use Ansible's known_hosts module to solve this problem. The module adds the host key to a user's ~/.ssh/known_hosts file on the managed host (here, inside the Jenkins container), similar to what you describe as a manual step.
Please note the limitation mentioned in the module's documentation as well:
If you have a very large number of host keys to manage, you will find the template module more useful.
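A minimal sketch of such a task, assuming the Jenkins home path of the official image and a hypothetical server name; the key value is a placeholder for your server's real public key (which you could obtain with ssh-keyscan):

```yaml
# Hypothetical task: pin the git server's host key inside the Jenkins container,
# so git-over-ssh works without interactive confirmation.
- name: Trust the git server's host key
  known_hosts:
    path: /var/jenkins_home/.ssh/known_hosts
    name: mybitbucketserver.org
    key: "mybitbucketserver.org ssh-rsa AAAA...replace-with-real-key"
    state: present
```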
Upvotes: 0