Jim

Reputation: 14300

Getting Ansible "Permission denied (publickey,password)" on multiple VMs

I'm getting the following error when I try to run a very simple playbook with the command "ansible-playbook site.yml -vvvv" against two Vagrant virtual machines, but I'm not sure how to resolve it.

PLAY [Configure servers] **************************************** 

GATHERING FACTS *************************************************************** 
<dev.db> ESTABLISH CONNECTION FOR USER: vagrant
<dev.db> REMOTE_MODULE setup
<dev.db> EXEC ssh -C -tt -vvv -o ControlMaster=auto -o ControlPersist=60s -o ControlPath="/Users/flaugher/.ansible/cp/ansible-ssh-%h-%p-%r" -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=vagrant -o ConnectTimeout=10 dev.db /bin/sh -c 'mkdir -p $HOME/.ansible/tmp/ansible-tmp-1455651230.31-78392827258464 && chmod a+rx $HOME/.ansible/tmp/ansible-tmp-1455651230.31-78392827258464 && echo $HOME/.ansible/tmp/ansible-tmp-1455651230.31-78392827258464'
fatal: [dev.db] => SSH Error: Permission denied (publickey,password).
    while connecting to 192.168.2.102:22
It is sometimes useful to re-run the command using -vvvv, which prints SSH debug output to help diagnose the issue.
<dev.web> ESTABLISH CONNECTION FOR USER: vagrant
<dev.web> REMOTE_MODULE setup
<dev.web> EXEC ssh -C -tt -vvv -o ControlMaster=auto -o ControlPersist=60s -o ControlPath="/Users/flaugher/.ansible/cp/ansible-ssh-%h-%p-%r" -o StrictHostKeyChecking=no -o KbdInteractiveAuthentication=no -o PreferredAuthentications=gssapi-with-mic,gssapi-keyex,hostbased,publickey -o PasswordAuthentication=no -o User=vagrant -o ConnectTimeout=10 dev.web /bin/sh -c 'mkdir -p $HOME/.ansible/tmp/ansible-tmp-1455651230.3-64535332497824 && chmod a+rx $HOME/.ansible/tmp/ansible-tmp-1455651230.3-64535332497824 && echo $HOME/.ansible/tmp/ansible-tmp-1455651230.3-64535332497824'
fatal: [dev.web] => SSH Error: Permission denied (publickey,password).
    while connecting to 192.168.2.101:22
It is sometimes useful to re-run the command using -vvvv, which prints SSH debug output to help diagnose the issue.

TASK: [debug msg="hello, world!"] ********************************************* 
FATAL: no hosts matched or all hosts have already failed -- aborting


PLAY RECAP ******************************************************************** 
           to retry, use: --limit @/Users/smith/site.retry

dev.db                     : ok=0    changed=0    unreachable=1    failed=0   
dev.web                    : ok=0    changed=0    unreachable=1    failed=0   

Here's how my VMs are configured:

Vagrant.configure(2) do |config|
    config.vm.define "web" do |web|
        web.vm.box = "debian/jessie64"
        web.vm.network "private_network", ip: "192.168.2.101"
        web.vm.network :forwarded_port, guest: 22, host: 10122, id: "ssh"
        web.vm.host_name = "dev.web"
    end
    config.vm.define "db" do |db|
        db.vm.box = "debian/jessie64"
        db.vm.network "private_network", ip: "192.168.2.102"
        db.vm.network :forwarded_port, guest: 22, host: 10222, id: "ssh"
        db.vm.host_name = "dev.db"
    end
end

Here's my ansible.cfg file:

[defaults]
hostfile = inventory.ini
remote_user = vagrant
host_key_checking = False
# private_key_file = ???

Here is inventory.ini:

[development]
dev.web
dev.db

And the playbook site.yml:

- name: Configure servers
  hosts: development
  gather_facts: True
  vars:
    foo: "bar"
  tasks:
    - debug: msg="hello, world!"
    - fail:

This seems like an SSH key file problem. My first thought was that, since there's a private key file for each virtual machine:

.vagrant/machines/web/virtualbox/private_key
.vagrant/machines/db/virtualbox/private_key

... perhaps I need to specify multiple private_key_file settings in my config file? However, the Ansible documentation doesn't say this is possible. I also thought I might need separate "[web]" and "[db]" groups in the inventory so that I could specify a separate key file for each, but again the documentation doesn't indicate this is possible. The vagrant user on my local machine does have public and private keys in its ~vagrant/.ssh directory, all with the correct permissions. I am able to SSH to each VM with the command "vagrant ssh [web | db]", and the vagrant user's home directory on each VM has an authorized_keys file in its ~/.ssh directory. Can anyone see what I'm doing wrong?
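For what it's worth, one sanity check I can run is to SSH directly the way Ansible would, using the private_network IPs from the Vagrantfile and the per-machine keys listed above (run from the Vagrant project directory):

# Try each machine's Vagrant-generated key against its private_network IP
ssh -i .vagrant/machines/web/virtualbox/private_key vagrant@192.168.2.101
ssh -i .vagrant/machines/db/virtualbox/private_key vagrant@192.168.2.102

# Show the exact key path, user, and port Vagrant itself uses
vagrant ssh-config web
vagrant ssh-config db

If those direct logins work, the keys themselves are fine and the problem is presumably that Ansible doesn't know which key to use for which host.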

Thanks!

Upvotes: 0

Views: 4348

Answers (1)

ydaetskcoR

Reputation: 56997

You can specify keys at the inventory level with ansible_ssh_private_key_file.

You can do this with either group_vars or host_vars, depending on your use case. In your case, you might just want to put them inline in your inventory file, like this:

[development]
dev.web ansible_ssh_private_key_file=/path/to/.vagrant/machines/web/virtualbox/private_key
dev.db ansible_ssh_private_key_file=/path/to/.vagrant/machines/db/virtualbox/private_key
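
If you'd rather keep the inventory file clean, the same variable can live in per-host files instead. A sketch (adjust the /path/to/ prefix to your project layout):

# host_vars/dev.web
ansible_ssh_private_key_file: /path/to/.vagrant/machines/web/virtualbox/private_key

# host_vars/dev.db
ansible_ssh_private_key_file: /path/to/.vagrant/machines/db/virtualbox/private_key

Ansible picks these up automatically from a host_vars directory next to the inventory or playbook, keyed by the inventory hostname.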

Upvotes: 1
