Angelo Farina

Reputation: 31

Create and set up GCP VMs with Ansible: ssh Permission denied (publickey)

Before executing the playbook I created a service account and granted it the "Compute Admin", "OS Login Admin" and "Service Account User" roles, then downloaded its JSON key to my machine. The service account state is "active". On my machine I wrote a playbook that sets up one GCP VM, installs Apache and copies a dummy web page onto it.
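For reference, a rough sketch of the equivalent gcloud commands for that setup (the service account name ansible-sa is a placeholder; the project ID is the one from the playbook vars below):

    # create the service account and grant the three roles mentioned above
    gcloud iam service-accounts create ansible-sa --project ansible-xxxxxx
    gcloud projects add-iam-policy-binding ansible-xxxxxx \
        --member="serviceAccount:ansible-sa@ansible-xxxxxx.iam.gserviceaccount.com" \
        --role="roles/compute.admin"
    gcloud projects add-iam-policy-binding ansible-xxxxxx \
        --member="serviceAccount:ansible-sa@ansible-xxxxxx.iam.gserviceaccount.com" \
        --role="roles/compute.osAdminLogin"
    gcloud projects add-iam-policy-binding ansible-xxxxxx \
        --member="serviceAccount:ansible-sa@ansible-xxxxxx.iam.gserviceaccount.com" \
        --role="roles/iam.serviceAccountUser"
    # download the JSON key referenced as gcp_cred_file in the playbook
    gcloud iam service-accounts keys create ~/ansible-key.json \
        --iam-account=ansible-sa@ansible-xxxxxx.iam.gserviceaccount.com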

- name: Create Compute Engine instances
  hosts: localhost
  gather_facts: no
  vars:
    gcp_project: ansible-xxxxxx
    gcp_cred_kind: serviceaccount
    gcp_cred_file: ~/ansible-key.json
    zone: "us-central1-a"
    region: "us-central1"
    machine_type: "n1-standard-1"
    image: "projects/ubuntu-os-cloud/global/images/family/ubuntu-1604-lts"

  tasks:
    - name: Create an IP address for instance
      gcp_compute_address:
        name: "{{ zone }}-ip"
        region: "{{ region }}"
        project: "{{ gcp_project }}"
        service_account_file: "{{ gcp_cred_file }}"
        auth_kind: "{{ gcp_cred_kind }}"
      register: gce_ip

    - name: Bring up the instance in the zone.
      gcp_compute_instance:
        name: "{{ zone }}"
        machine_type: "{{ machine_type }}"
        disks:
          - auto_delete: true
            boot: true
            initialize_params:
              source_image: "{{ image }}"
        network_interfaces:
          - access_configs:
              - name: External NAT
                nat_ip: "{{ gce_ip }}"
                type: ONE_TO_ONE_NAT
        tags:
          items:
            - http-server
            - https-server
        zone: "{{ zone }}"
        project: "{{ gcp_project }}"
        service_account_file: "{{ gcp_cred_file }}"
        auth_kind: "{{ gcp_cred_kind }}"
      register: gce

...after instantiating the VM I connect to it via SSH...

post_tasks:
    - name: Wait for SSH for instance
      wait_for: delay=5 sleep=5 host={{ gce_ip.address }} port=22 state=started timeout=100
    - name: Save host data for first zone
      add_host: hostname={{ gce_ip.address }} groupname=gce_instances_ips
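For reference, the connection details can also be attached to the host entry when it is added, instead of being passed on the command line; a minimal sketch, where the user name and key path are assumptions:

    - name: Save host data for first zone
      add_host:
        hostname: "{{ gce_ip.address }}"
        groupname: gce_instances_ips
        # any extra key/value pairs become host variables for later plays
        ansible_user: sa_123456789                        # OS Login user (assumption)
        ansible_ssh_private_key_file: ~/.ssh/gcp_oslogin  # private key path (assumption)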

The playbook never gets past this step. I run it with ansible-playbook main.yaml --user sa_123456789, and the error is either

fatal: [130.211.225.130]: UNREACHABLE! => {"changed": false, "msg": "Failed to connect to the host via ssh: [email protected]: Permission denied (publickey).", "unreachable": true}

or a simple timeout

fatal: [localhost]: FAILED! => {"changed": false, "elapsed": 105, "msg": "Timeout when waiting for 130.211.225.130:22"}

In the GCE metadata I also set enable-oslogin to TRUE. The VM is created without any problem and is reachable through the GCP console (GUI), but if I try to connect over SSH with keys I generated myself, the machine seems unreachable. Does anyone have experience with this type of error?
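For reference, that metadata flag could also be set from the playbook itself; a minimal sketch of the extra parameter on the gcp_compute_instance task above (everything else stays the same):

    - name: Bring up the instance in the zone.
      gcp_compute_instance:
        # ...same parameters as in the playbook above...
        metadata:
          enable-oslogin: "TRUE"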

Upvotes: 3

Views: 2519

Answers (1)

Airus

Reputation: 35

This error usually occurs when no valid public/private key pair has been generated and set up for the connection.

Try any of the following approaches:

  1. Create or edit the ansible.cfg file in your playbook directory and add a line with the full path of your private key (the option is named private_key_file):

    [defaults]
    private_key_file = /Users/username/.ssh/private_key

    This sets the private key globally for all hosts in your playbook.

  2. Add the private key to your playbook with a connection variable:

    vars:
      ansible_ssh_private_key_file: "/home/ansible/.ssh/id_rsa"
    
  3. You can also pass the private key to use directly on the command line:

    ansible-playbook -vvvv --private-key=/Users/you/.ssh/your_key playbookname.yml
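Since the question has OS Login enabled on the instance, the public half of the key also has to be registered in the OS Login profile of the account used to connect; a rough sketch (the key path is an assumption, and gcloud must be authenticated as, or impersonating, the account whose profile receives the key):

    ssh-keygen -t rsa -f ~/.ssh/gcp_oslogin
    gcloud compute os-login ssh-keys add --key-file ~/.ssh/gcp_oslogin.pub
    ansible-playbook main.yaml --user sa_123456789 --private-key ~/.ssh/gcp_oslogin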
    

Upvotes: 1
