Duy Phan

Reputation: 33

Ansible wait for initializing host before doing tasks in playbook

My host needs some time (about 20 seconds) to initialize its CLI session before it can accept CLI commands.

[screenshot: CLI init]

I'm trying to run the command with this Ansible playbook:

---
- name: Run show sub command
  hosts: em
  gather_facts: no
  remote_user: duypn



  tasks:
   - name: wait for SSH to respond on all hosts
     local_action: wait_for host=em port=22 delay=60 state=started

   - name: run show sub command
     raw: show sub id=xxxxx;display=term-type

After 10 minutes, Ansible gives me output which is not the result of the show sub command :(

...
["CLI Session initializing..", "Autocompleter initializing..", "CLI>This session has been IDLE for too long.", 
...

I'm glad to hear your suggestion. Thank you :)

Upvotes: 0

Views: 3127

Answers (2)

David Wer

Reputation: 418

So I had the same problem and this is how I solved it:

---
- name: "Get instances info"
  ec2_instance_facts:
    aws_access_key: "{{ aws_access_key }}"
    aws_secret_key: "{{ aws_secret_key }}"
    region: "{{ aws_region }}"
    filters:
      vpc-id: "{{ vpc_id }}"
      private-ip-address: "{{ ansible_ssh_host }}"
  delegate_to: localhost
  register: my_ec2

- name: "Waiting for {{ hostname }} to respond"
  wait_for:
    host: "{{ item.public_ip_address }}"
    state: "{{ state }}"
    sleep: 1
    port: 22
  delegate_to: localhost
  with_items:
    - "{{ my_ec2.instances }}"

That is the task file for the role named aws_ec2_status.

The playbook I ran looks like this:

---
# Create an ec2 instance in aws

- hosts:                nodes
  gather_facts:         false
  serial:               1
  vars:
    state:              "present"
  roles:
   - aws_create_ec2


- hosts:                nodes
  gather_facts:         no
  vars:
    state:              "started"
  roles:
    - aws_ec2_status

The reason I split the create and the check into two different playbooks is that I want the playbook to create the instances without waiting for one to be ready before creating the next. But if the second instance depends on the first one, you should combine them.
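If they do need to be combined, one way (an untested sketch, reusing the same roles and passing state as a role parameter) would be a single play like this:

---
# Combined variant: create each instance and wait for it before moving on
- hosts:                nodes
  gather_facts:         false
  serial:               1
  roles:
    - { role: aws_create_ec2, state: "present" }
    - { role: aws_ec2_status, state: "started" }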

FYI: let me know if you want to see my aws_create_ec2 role.

Upvotes: 0

Evert

Reputation: 308

I don't have a copy-paste solution for you, but one thing I learned is to put a sleep after SSH is 'up' to allow the machine to finish its work. This might give you a nudge in the right direction.

- name: Wait for SSH to come up
  local_action: wait_for
                host={{ item.public_ip }}
                port=22
                state=started
  with_items: "{{ ec2.instances }}"

- name: waiting for a few seconds to let the machine start
  pause:
    seconds: 20
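
Applied to the playbook in the question, a rough sketch (untested, and the 20 seconds is just a guess based on the init time you mentioned) would be:

   - name: wait for SSH to respond on all hosts
     local_action: wait_for host=em port=22 delay=60 state=started

   - name: give the CLI session time to initialize
     pause:
       seconds: 20

   - name: run show sub command
     raw: show sub id=xxxxx;display=term-type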

Upvotes: 1
