beydogan

Reputation: 1110

Ansible copy ssh key from one host to another

I have 2 app servers with a load balancer in front of them and 1 database server in my system. I'm provisioning them with Ansible. The app servers run Nginx + Passenger for a Rails app. I will use Capistrano for deployment, but I have an issue with ssh keys: my git repo is on another server, and I have to generate ssh public keys on the app servers and add them to the git server (to its authorized_keys file). How can I do this in an Ansible playbook?

PS: I may have more than 2 app servers.


Upvotes: 49

Views: 91713

Answers (7)

veedub

Reputation: 1

This is what I use to exchange RSA keys between multiple hosts (many to many). I have variations that create the user accounts along with the key pairs, and others that handle 'one to many' and 'many to one' scenarios.

#:TASK: Exchange SSH RSA keys between multiple hosts (many to many)
#:....: RSA keypairs are created as required at play (1)
#:....: authorized_keys updated at play <root user (2a.1 & 2a.2)>, <non root user (2b.1 & 2b.2)>
#:....: -- We need a 2a or 2b option because there is a 'chicken & egg' issue for the root user!
#:....: known_hosts files are updated at play (3)
#:REQD: *IF* your security policy allows:
#:....: -- Add 'host_key_checking = False' to ansible.cfg
#:....: -- Or use one of the variations of 'StrictHostKeyChecking=no' elsewhere:
#:....: e.g. inventory setting - ansible_ssh_common_args='-o StrictHostKeyChecking=no'
#:....: - or - host variable - ansible_ssh_extra_args='-o StrictHostKeyChecking=no'
#:USER: RUN this as the 'root' user; it hasn't been tested or adapted to be run as any other user
#:EXEC: ansible-playbook <playbook>.yml -e "nodes=<inventory_hosts> user=<username>"
#:VERS: 20230119.01
#
---
- name: Exchange RSA keys and update known_hosts between multiple hosts
  hosts: "{{ nodes }}"
  vars:
    ip: "{{ hostvars[inventory_hostname]['ansible_default_ipv4']['address'] }}"
  tasks:
    - name: (1) Generate an SSH RSA key pair
      community.crypto.openssh_keypair:
        path: "~{{ user }}/.ssh/id_rsa"
        comment: "{{ user }}@{{ ip }}"
        size: 2048

    - name: (2) Retrieve RSA key/s then exchange them with other hosts
      block:
        - name: (2a.1) Retrieve client public RSA key/s to a variable
          slurp:
            src: ".ssh/id_rsa.pub"
          register: rsa_key

          # Using the debug module here seemed to make the slurp above more reliable;
          # during testing, not every host that was slurped worked without it.
        - debug:
            msg: "{{ rsa_key['content'] | b64decode }} / {{ ip }} / {{ user }}"

        - name: (2a.2) Exchange RSA keys between hosts and update authorized_key files
          delegate_to: "{{ item }}"
          authorized_key:
            user: "{{ user }}"
            key: "{{ rsa_key['content'] | b64decode }}"
          with_items:
            - "{{ ansible_play_hosts }}"
          when: item != inventory_hostname
      when: user == "root"

    - name: (2b) Retrieve RSA key/s then exchange them with other hosts
      block:
        # The slurp is repeated here because rsa_key from (2a.1) is only
        # registered when user == root.
        - name: (2b.1) Retrieve client public RSA key/s to a variable
          slurp:
            src: "~{{ user }}/.ssh/id_rsa.pub"
          register: rsa_key

        - name: (2b.2) Exchange RSA keys between hosts and update authorized_key files
          delegate_to: "{{ item }}"
          authorized_key:
            user: "{{ user }}"
            key: "{{ rsa_key['content'] | b64decode }}"
          with_items:
            - "{{ ansible_play_hosts }}"
          when: item != inventory_hostname
      when: user != "root"

    - name: (3) Ensure nodes are present in known_hosts file
      become: yes
      become_user: "{{ user }}"
      known_hosts:
        name: "{{ item }}"
        path: "~{{ user }}/.ssh/known_hosts"
        key: "{{ lookup('pipe', 'ssh-keyscan -t rsa {{ item }}') }}"
      when: item != inventory_hostname
      with_items:
        - "{{ ansible_play_hosts }}"

Upvotes: 0

Juan Islas

Reputation: 1

I wanted to contribute a version of this code that replaces the shell module with slurp. Thanks a lot to Jonas Libbrecht for the code; it is quite useful.

- name: Get ssh keys
  slurp:
    src: /home/nsbl/.ssh/id_ed25519.pub
  register: ssh_keys
  tags:
    - ssh

- name: Check keys
  debug: msg="{{ ssh_keys['content'] | b64decode }}"
  tags:
    - ssh

- name: Deploy keys on nodes
  authorized_key:
    user: root
    key: "{{ item[1] }}"
  delegate_to: "{{ item[0] }}"
  with_nested:
    - "{{ groups['cluster'] }}"
    - "{{ ssh_keys['content'] | b64decode }}"
  tags:
    - ssh
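
These tasks assume the ed25519 key pair already exists on the node; if it might not, a small sketch to create it first (the path and the nsbl user come from the snippet above, and the community.crypto collection is assumed to be installed):

- name: Ensure the ed25519 key pair exists
  community.crypto.openssh_keypair:
    path: /home/nsbl/.ssh/id_ed25519
    type: ed25519
  tags:
    - ssh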

Thanks community.

Upvotes: -1

MUY Belgium

Reputation: 2452

Use the openssh_keypair and authorized_key modules to create and deploy the keys at the same time, without saving them on your Ansible control host.

- openssh_keypair:
    group: root
    owner: root
    path: /some/path/in/your/server
  register: ssh_key

- name: Store public key into origin
  delegate_to: central_server_name
  authorized_key:
    key: "{{ ssh_key.public_key }}"
    comment: "{{ ansible_hostname }}"
    user: any_user_on_central

This creates the ssh key on your server (if needed) and ensures it can be used to open an ssh connection to central_server_name.
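
For context, a minimal play wrapping those two tasks might look like this (the appservers group name is a placeholder, and central_server_name must resolve to an inventory host for delegate_to to work):

- hosts: appservers
  become: yes
  tasks:
    # ... the openssh_keypair and authorized_key tasks above go here ...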

Upvotes: 0

Nicholas Sushkin

Reputation: 13780

I created a parameterized role to make sure an ssh key pair is generated for a source user on a source remote host, and that its public key is copied to a target user on a target remote host.

You can invoke that role in a nested loop of source and target host lists, as shown at the bottom:

---
#****h* ansible/ansible_roles_ssh_authorize_user
# NAME
#   ansible_roles_ssh_authorize_user - Authorizes user via ssh keys
#
# FUNCTION
#
#   Copies user's SSH public key from a source user in a source host
#   to a target user in a target host
#
# INPUTS
#
#   * ssh_authorize_user_source_user
#   * ssh_authorize_user_source_host
#   * ssh_authorize_user_target_user
#   * ssh_authorize_user_target_host
#****
#****h* ansible_roles_ssh_authorize_user/main.yml
# NAME
#   main.yml - Main playbook for role ssh_authorize_user
# HISTORY
#   $Id: $
#****

- assert:
    that:
      - ssh_authorize_user_source_user != ''
      - ssh_authorize_user_source_host != ''
      - ssh_authorize_user_target_user != ''
      - ssh_authorize_user_target_host != ''
  tags:
    - check_vars
- name: Generate SSH Keypair in Source
  user:
    name: "{{ ssh_authorize_user_source_user }}"
    state: present
    ssh_key_comment: "ansible-generated for {{ ssh_authorize_user_source_user }}@{{ ssh_authorize_user_source_host }}"
    generate_ssh_key: yes
  delegate_to: "{{ ssh_authorize_user_source_host }}"
  register: source_user
- name: Install SSH Public Key in Target
  authorized_key:
    user: "{{ ssh_authorize_user_target_user }}"
    key: "{{ source_user.ssh_public_key }}"
  delegate_to: "{{ ssh_authorize_user_target_host }}"
- debug:
    msg: "{{ ssh_authorize_user_source_user }}@{{ ssh_authorize_user_source_host }} authorized to log in to {{ ssh_authorize_user_target_user }}@{{ ssh_authorize_user_target_host }}"

Invoking role in a loop:

- name: Authorize User
  include_role:
    name: ssh_authorize_user
  vars:
    ssh_authorize_user_source_user: "{{ git_user }}"
    ssh_authorize_user_source_host: "{{ item[0] }}"
    ssh_authorize_user_target_user: "{{ git_user }}"
    ssh_authorize_user_target_host: "{{ item[1] }}"
  with_nested:
    - "{{ app_server_list }}"
    - "{{ git_server_list }}"

Upvotes: 5

Jonas Libbrecht

Reputation: 777

This does the trick for me: it collects the public ssh keys of the nodes and distributes them over all of the nodes, so they can communicate with each other.

- hosts: controllers
  gather_facts: false
  remote_user: root
  tasks:
    - name: fetch all public ssh keys
      shell: cat ~/.ssh/id_rsa.pub
      register: ssh_keys
      tags:
        - ssh

    - name: check keys
      debug: msg="{{ ssh_keys.stdout }}"
      tags:
        - ssh

    - name: deploy keys on all servers
      authorized_key: user=root key="{{ item[0] }}"
      delegate_to: "{{ item[1] }}"
      with_nested:
        - "{{ ssh_keys.stdout }}"
        - "{{groups['controllers']}}"
      tags:
        - ssh

Note: this is for the root user.
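
If you need the same exchange for a non-root user, a hedged adaptation (deploy is a placeholder user that must already exist and be reachable over ssh) would be:

- hosts: controllers
  gather_facts: false
  remote_user: deploy
  tasks:
    - name: fetch public ssh key
      shell: cat ~/.ssh/id_rsa.pub
      register: ssh_keys

    - name: deploy keys on all servers
      authorized_key: user=deploy key="{{ item[0] }}"
      delegate_to: "{{ item[1] }}"
      with_nested:
        - "{{ ssh_keys.stdout }}"
        - "{{ groups['controllers'] }}"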

Upvotes: 37

Take a look at the authorized_key module for info on how to manage your public keys.

The most straightforward solution I can think of would be to generate a fresh key pair for your application, to be shared across all your app instances. This may have security implications (you are indeed sharing keys between all instances!), but it'll simplify the provisioning process a lot.

You'll also need a deploy user on each app machine, to be used later on during the deployment process. You'll need your public key (or Jenkins's) in each deploy user's authorized_keys.

A sketch playbook:

---
- name: ensure app/deploy public key is present on git server
  hosts: gitserver
  tasks:
    - name: ensure app public key
      authorized_key:
        user: "{{ git_user }}"
        key: "{{ lookup('file', 'app_keys/id_dsa.pub') }}"
        state: present

- name: provision app servers
  hosts: appservers
  tasks:
    - name: ensure app/deploy user is present
      user: 
        name: "{{ deploy_user }}"
        state: present

    - name: ensure you'll be able to deploy later on
      authorized_key:
        user: "{{ deploy_user }}" 
        key: "{{ path_to_your_public_key }}" 
        state: present

    - name: ensure private key and public one are present
      copy:
        src: "{{ item }}"
        dest: "/home/{{ deploy_user }}/.ssh/{{ item | basename }}"
        owner: "{{ deploy_user }}"
        mode: 0600
      with_items:
        - app_keys/id_dsa.pub
        - app_keys/id_dsa
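
Note that the playbook assumes the shared key pair already exists under app_keys/ on the control machine; a hedged way to generate it there once (the id_dsa file name simply matches the paths above, and the community.crypto collection is assumed):

- name: generate the shared app key pair on the control machine
  hosts: localhost
  connection: local
  tasks:
    - community.crypto.openssh_keypair:
        path: app_keys/id_dsa
        type: rsa
        size: 4096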

Upvotes: 32

jarv

Reputation: 5586

I would create a deploy user that is restricted to pull access on your repos. You can allow this over http, or there are a few options to do it over ssh.

If you don't care about limiting the user to read-only access to your repo, then you can create a normal ssh user. Once the user is created, you can use Ansible's authorized_key module to add the user's public key to the authorized_keys file on the git server.

Once that is setup you have two options:

  1. If you use ssh, use ssh agent forwarding so that the user running the Ansible task can authenticate to the git server with their own key.

  2. Temporarily transfer the key and use the ssh_opts/key_file options of the git module to clone with the deploy user's key (see the sketch after this list).
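
A hedged sketch of option 2, assuming the key was transferred to /home/deploy/.ssh/deploy_key and using a placeholder repo URL:

- name: clone the repo with the transferred deploy key
  git:
    repo: git@gitserver:myorg/myapp.git
    dest: /srv/myapp
    key_file: /home/deploy/.ssh/deploy_key
    ssh_opts: "-o StrictHostKeyChecking=no"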

Upvotes: 1
