Alex Cohen

Reputation: 6236

Ansible: How to specify an ssh key for a single task?

I have a playbook that creates an EC2 instance, copies a few files over to it, and then runs some shell commands on the instance.

The issue is that I want to specify which SSH key Ansible uses for the copy and shell tasks, and make sure it does not attempt to use that key for the other tasks, which run on localhost. Here is my playbook:

---
- hosts: localhost
  connection: local
  gather_facts: false

  vars:
    # CentOS 7 x86_64 Devel AtomicHost EBS HVM 20150306_01 (ami-07e6c437)
    # for us-west-2
    - ami: 'ami-07e6c437'
    - key_pair: 'my-key'

  tasks:

    - name: Create a centos server
      ec2:
        region: 'us-west-2'
        key_name: '{{ key_pair }}'
        group: default
        instance_type: t2.micro
        image: '{{ ami }}'
        wait: true
        exact_count: 1
        count_tag:
          Name: my-instance
        instance_tags:
          Name: my-instance
      register: ec2

    # shows the json data for the instances created
    - name: Show ec2 instance json data
      debug:
        msg: "{{ ec2['tagged_instances'] }}"

    - name: Wait for SSH to come up
      wait_for: host={{ ec2['tagged_instances'][0]['public_ip'] }} port=22 delay=1 timeout=480 state=started

    - name: Accept new ssh fingerprints                                       
      shell: ssh-keyscan -H "{{ ec2['tagged_instances'][0]['public_ip'] }}" >> ~/.ssh/known_hosts          

    # THE TASKS I NEED HELP ON
    - name: Copy files over to ec2 instance
      remote_user: centos 
      copy: src={{ item }} dest=/home/centos/ mode=0600
      with_fileglob:
        - my-files/*
      delegate_to: "{{ ec2['tagged_instances'][0]['public_ip'] }}"   

    # THE TASKS I NEED HELP ON
    - name: run commands
      remote_user: centos                                        
      shell: "{{ item }}"
      delegate_to: "{{ ec2['tagged_instances'][0]['public_ip'] }}"
      with_items:
        - "sudo yum update -y"
        - "sudo yum install -y nmap ruby"
      ignore_errors: true 

Upvotes: 2

Views: 4564

Answers (1)

2ps

Reputation: 15966

Yeah, I agree with @techraf. But the answer to the question you posted is that you have to dynamically change your inventory for the new instance that you provisioned and then run remote ansible plays on that new host. So you would add this to the end of your first play:

    - local_action:
        module: add_host
        hostname: newhost
        ansible_host: "{{ ec2['tagged_instances'][0]['public_ip'] }}"
        ansible_user: centos
        ansible_ssh_private_key_file: /path/to/keyfile

###### New play
- name: Configure my new instance!
  hosts: newhost
  tasks:
    # THE TASKS I NEED HELP ON
    - name: Copy files over to ec2 instance
      copy: src={{ item }} dest=/home/centos/ mode=0600
      with_fileglob:
        - my-files/*
    # Use the yum module here instead, much easier
    - name: run commands
      shell: "{{ item }}"
      with_items:
        - "sudo yum update -y"
        - "sudo yum install -y nmap ruby"
      ignore_errors: true 
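As the comment above suggests, the shell loop can be replaced with the yum module, which is idempotent and handles privilege escalation via become instead of sudo inside the command. A minimal sketch (task names are illustrative, assuming Ansible 2.x):

```yaml
    # Equivalent of the two shell commands above, using the yum module.
    - name: Update all packages
      yum:
        name: '*'
        state: latest
      become: true

    - name: Install nmap and ruby
      yum:
        name:
          - nmap
          - ruby
        state: present
      become: true
```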

Edit: Adding that you can always just set the ssh private key by using:

- set_fact: ansible_ssh_private_key_file=/path/to/keyfile

with the caveat that the above set_fact will only change the ssh private key file for the currently running host (e.g., for localhost in your example play above).
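Alternatively, connection variables such as ansible_ssh_private_key_file can be scoped to a single task with task-level vars, which sidesteps the set_fact caveat entirely. A sketch based on the copy task from the question (the key path is a placeholder):

```yaml
    - name: Copy files over to ec2 instance
      copy: src={{ item }} dest=/home/centos/ mode=0600
      with_fileglob:
        - my-files/*
      remote_user: centos
      delegate_to: "{{ ec2['tagged_instances'][0]['public_ip'] }}"
      vars:
        # Task-level vars override connection variables for this task only.
        ansible_ssh_private_key_file: /path/to/keyfile
```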

Upvotes: 1
