MattoTodd

Reputation: 15209

Deploying to multiple EC2 servers with Fabric

I'm wondering if anyone has experience deploying to multiple servers behind a load balancer on ec2 with fabric

I have used Fabric for a while now and have no issues with it, or with deploying to multiple servers. What I would like to do in this scenario (let's say I have ten instances running) is: de-register half (5) of the boxes from my load balancer, deploy my code to them, and run a smoke test. If everything looks good, register them with the load balancer again, then de-register the remaining 5 instances, deploy to them, and register them back to the load balancer.

I have no problem accomplishing any of the individual tasks (de-registering, running tests, deploying, etc.); I just don't know how to organize my hosts in a simple fashion so that I can deploy to the first half, then the second half. Fabric seems to be set up to run the same tasks on all hosts in order (task 1 on host 1, task 1 on host 2, task 2 on host 1, task 2 on host 2, and so on).

My first thought was to create a task to handle the first part of de-registering, deploying, and testing, and then set env.hosts for the second half of the servers, but I felt this seemed a bit hokey.

Has anyone modeled something similar to this with Fabric before?
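For what it's worth, the batching described above is plain Python once the individual steps exist; a minimal sketch of the rolling pattern, where the step callables (deregister, deploy, smoke_test, register) are placeholders for whatever already works:

    # Sketch: split the host list in half and roll through the batches.
    # The step functions are hypothetical hooks, not real infrastructure calls.
    def split_in_half(hosts):
        """Return (first_half, second_half) of a host list."""
        mid = len(hosts) // 2
        return hosts[:mid], hosts[mid:]

    def rolling_deploy(hosts, deregister, deploy, smoke_test, register):
        """De-register, deploy, test, and re-register each half in turn."""
        for batch in split_in_half(hosts):
            deregister(batch)
            deploy(batch)
            if not smoke_test(batch):
                raise RuntimeError("smoke test failed on %s" % (batch,))
            register(batch)
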

Upvotes: 8

Views: 4487

Answers (6)

Simion Agavriloaei

Reputation: 3697

Or you could simply write a method which sets some variables, for example:

from fabric.api import env

def live():
    global PATH, ENV_PATH
    env.hosts = ["22.2.222.2"]
    env.user = 'test'
    PATH = '/path/to/project'
    # optional, if using virtualenv
    ENV_PATH = '/path/to/virtualenv'
    # overwrite whatever variables you need to change on the current machine

and before running the deploy command, run:

fab live deploy

Details: http://simionbaws.ro/programming/deploy-with-fabric-on-multiple-servers/
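The same pattern extends to any number of environments: one selector function per environment, run before the deploy task. A sketch, where the staging host and paths are made up and a tiny stand-in object replaces fabric.api.env so the pattern can be shown without Fabric installed:

    # _Env is a stand-in for fabric.api.env; use the real env in a fabfile.
    class _Env(object):
        pass

    env = _Env()
    PATH = ENV_PATH = None

    def staging():
        # mirror of live() above; all hosts and paths here are made up
        global PATH, ENV_PATH
        env.hosts = ["staging.example.com"]
        env.user = 'test'
        PATH = '/path/to/staging/project'
        ENV_PATH = '/path/to/staging/virtualenv'

and then: fab staging deploy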

Upvotes: 0

Nathan Fisher

Reputation: 11

I've successfully combined Fabric with boto: I populate the hosts list using boto, and the @parallel decorator's pool_size limits how many hosts execute in one go. The command looks as follows:

fab running deploy

The code looks like so:

from itertools import chain

from boto import ec2
from fabric.api import env, task, runs_once, parallel

@task
@runs_once
def running():
    ec2conn = ec2.connect_to_region(region)
    reservations = ec2conn.get_all_instances(filters={'instance-state-name': 'running'})
    # flatten each reservation's instances into a single list
    instances = list(chain.from_iterable(r.instances for r in reservations))
    env.hosts = [i.public_dns_name for i in instances]

@task
@parallel(pool_size=5)
def deploy():
    pass  # do stuff on up to 5 hosts in parallel

If you need to handle a subsection of hosts I'd suggest using tags.
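The host-list construction above can be sanity-checked without touching AWS by faking the reservation and instance objects that boto's get_all_instances returns; the Fake* classes here are stand-ins for illustration only:

    from itertools import chain

    # Minimal stand-ins for boto's Reservation and Instance objects.
    class FakeInstance(object):
        def __init__(self, dns):
            self.public_dns_name = dns

    class FakeReservation(object):
        def __init__(self, instances):
            self.instances = instances

    def hosts_from_reservations(reservations):
        """Flatten reservations into a flat list of public DNS names."""
        instances = chain.from_iterable(r.instances for r in reservations)
        return [i.public_dns_name for i in instances]
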

Upvotes: 1

Tadeck

Reputation: 137350

You can simplify this by defining roles (used for aggregating hosts), executing your tasks on one role, then running tests, then deploying to the second role.

Example of roledefs:

env.roledefs = {
    'first_half': ['host1', 'host2'],
    'second_half': ['host3', 'host4'],
}

def deploy_server():
    ...
    # deploy one host from current role here

def deploy():
    # first role:
    env.roles = ['first_half']
    execute('deploy_server')
    test()  # here test deployed servers
    # second role:
    env.roles = ['second_half']
    execute('deploy_server')


Upvotes: 5

Morgan

Reputation: 4131

You want to use the execute() function. This will allow you to do something like this:

def update():
    deallocate()
    push_code()
    smoke_test()  # could fail fast
    reallocate()

def deploy():
    execute(update, hosts=first_five)
    execute(update, hosts=last_five)

You could also make each of the deallocate, push_code, and smoke_test tasks its own execute() call inside deploy(); then you'd run all the deallocates, then all the code pushes, and so on.

Then, after a check of some sort, proceed to the remaining hosts with the same tasks.
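That phase-by-phase ordering (every step runs across the whole batch before the next step starts) can be sketched with a hypothetical run_phase() helper standing in for Fabric's execute(task, hosts=...):

    # run_phase is a stand-in for execute(task, hosts=hosts): one call per host.
    def run_phase(task, hosts):
        return [task(h) for h in hosts]

    def deploy_by_phase(hosts, steps):
        """Run each step across all hosts before moving on to the next step."""
        for step in steps:
            run_phase(step, hosts)
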

Upvotes: 3

rantanplan

Reputation: 7450

Fabric is not restricted to running the same tasks on all hosts.

Apart from the fact that you can explicitly set the hosts for a specific task with the -H command line parameter, you can use this pattern and this newer pattern to do exactly what you want.

Update: Here it shows how you can use roles

Upvotes: 0

spinlok

Reputation: 3661

Rather than meddle with env.hosts, you could pass a list (or any iterable) to the hosts decorator. Something like:

def deploy(half_my_hosts):
    @hosts(half_my_hosts)
    def mytask():
        # ...
        pass
    mytask()

Then you could split your env.hosts any way you like and pass each part to deploy().
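The splitting itself is just list slicing; a sketch, with a stand-in object replacing fabric.api.env and placeholder host names:

    # `env` here is a stand-in for fabric.api.env; hosts are placeholders.
    class _Env(object):
        hosts = ['web1', 'web2', 'web3', 'web4']

    env = _Env()

    def halves(hosts):
        """Split a host list into two halves for the two deploy() calls."""
        mid = len(hosts) // 2
        return hosts[:mid], hosts[mid:]

    first, second = halves(env.hosts)
    # deploy(first); deploy(second)  -- deploy() as defined in the answer above
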

Upvotes: -1
