DMCoding

Reputation: 1237

Copy files from remote host to vagrant instance using ansible

Similar questions have been asked before, but none have been answered or are specific to Vagrant.

I have a directory on host master which I would like to synchronize with my vagrant instance. Here's my playbook:

- hosts: master
  vars:
    backup_dir: /var/backups/projects/civi.common.scot/backups/latest/
    dest_dir: /var/import
  tasks:
    - name: Synchronize directories
      synchronize:
        src: "{{ backup_dir }}"
        dest: "{{ dest_dir }}"
        mode: pull
      delegate_to: default

Here is my inventory:

default ansible_host=192.168.121.199 ansible_port=22  ansible_user='vagrant' ansible_ssh_private_key_file='/run/media/daniel/RAIDStore/Workspace/docker/newhume/.vagrant/machines/default/libvirt/private_key'
master ansible_host=hume.common.scot

When I run this play, no files seem to be copied to disk, but the play neither raises an error nor exits.

With ssh.config.forward_agent = true in my Vagrantfile, I am able to issue the following command from the Vagrant guest:

rsync --rsync-path='sudo rsync' -avz -e ssh $remote_user@$remote_host:$remote_path $local_path

However, the following task does not work (it shows the same problem as the synchronize module):

- name: synchronize directories (bugfix for above)
  command: "rsync --rsync-path='sudo rsync' -avz -e ssh {{ remote_user }}@{{ remote_host }}:{{ backup_directory }} {{ dest_dir }}"

I have also tried using shell instead of command.

How can I copy these files to my vagrant instance?

Upvotes: 2

Views: 1281

Answers (1)

iptizer

Reputation: 1218

The "synchronize" Ansible module "is run and originates on the local host where Ansible is being run" (quoted from the module documentation). In other words, it copies between the control machine and a managed host. What you want is a copy from remote A (master) to remote B (default). To accomplish that, you have to exchange SSH keys for a specific user from B to A, and add A's host key to B's known_hosts. The following should guide you through the process:

- hosts: default
  tasks:
    # transfer local pub-key to remote authorized_keys
    - name: fetch local ssh key from root user
      shell: cat /root/.ssh/id_rsa.pub
      register: ssh_keys
      changed_when: false
    - name: deploy ssh key to remote server
      authorized_key:
        user: "root"
        key: "{{ item }}"
      delegate_to: "master"
      with_items:
        - "{{ ssh_keys.stdout }}"

    # fetch remote host key and add to local known_hosts
    # to omit key accept prompt
    - name: fetch ssh rsa host key from remote server
      shell: cat /etc/ssh/ssh_host_rsa_key.pub
      register: ssh_host_rsa_key
      delegate_to: master
      changed_when: false
    - name: create /root/.ssh/ if it does not exist
      file:
        path: "/root/.ssh/"
        owner: root
        group: root
        mode: 0700
        state: directory
    - name: add hostkey to root known host file
      lineinfile:
        path: "/root/.ssh/known_hosts"
        line: "{{ hostvars['master'].ansible_host }} {{ ssh_host_rsa_key.stdout }}"
        mode: 0600
        create: yes
        state: present

    # now call rsync to fetch from master
    - name: fetch from remote
      shell: rsync --rsync-path='sudo rsync' -avz -e ssh root@{{ hostvars['master'].ansible_host }}:{{ backup_directory }} {{ dest_dir }}
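Once the play has run, a quick sanity check can confirm the key exchange worked. This hypothetical extra task (appended to the same play) attempts a non-interactive SSH from default to master; BatchMode forbids password prompts, so it fails unless key-based login is in place:

```yaml
    # hypothetical sanity check: fails unless key-based
    # root login from default to master works
    - name: verify passwordless ssh from default to master
      command: ssh -o BatchMode=yes root@{{ hostvars['master'].ansible_host }} true
      changed_when: false
```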
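For reference, the line the lineinfile task writes is simply the master's address followed by the contents of its public host-key file. A minimal sketch, with a hypothetical placeholder key:

```shell
# build a known_hosts-style entry the way the play does
# (the host key below is a hypothetical placeholder)
master_name="hume.common.scot"
host_key="ssh-rsa AAAAB3NzaExampleKey"
echo "$master_name $host_key"
# prints: hume.common.scot ssh-rsa AAAAB3NzaExampleKey
```

An entry in this format is what lets SSH verify master's identity without the interactive key-accept prompt.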

Upvotes: 1
