Reputation: 111
The environment I'm running in requires that the server be stopped before web application files can be copied over, so I'd like to perform that stop/sync/start sequence using Fabric.
If I want to deploy to 20 hosts and I do something like this...
def deploy():
    run("server stop")
    run("rsync ...")
    run("server start")
...there will be unnecessary downtime, as every single server is first brought down, then files are synced to all servers, and finally all servers are brought back up. This stems from the fact that each "run" command is executed on every host before the next one begins.
Is there an elegant way to have multiple commands run on each host? This is a naive stab at what I'm looking for:
for host in env.hosts:
    env.host = host
    run("server stop")
    run("rsync ...")
    run("server start")
I did see that there is the ability to have "run" commands occur in parallel, but that's not exactly what I'm looking for.
Upvotes: 1
Views: 3236
Reputation: 1564
I think your first example should do what you want.
This will run all three commands on server A first, then on server B, and so on...
def deploy():
    run("server stop")
    run("rsync ...")
    run("server start")
If you wanted to stop all servers, then update all servers, then finally restart them all, you could write:
from fabric.api import execute, run, runs_once

def stop_server():
    run("server stop")

def update_server():
    run("rsync ...")

def start_server():
    run("server start")

@runs_once
def deploy():
    execute(stop_server)
    execute(update_server)
    execute(start_server)
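If the rsync step dominates the downtime, a possible variation (my own sketch, not required by the above) is to mark the update task @parallel so all hosts sync at once; each execute() call still finishes on every host before the next phase starts:

from fabric.api import execute, parallel, run, runs_once

def stop_server():
    run("server stop")

@parallel  # rsync to all hosts concurrently instead of one after another
def update_server():
    run("rsync ...")

def start_server():
    run("server start")

@runs_once
def deploy():
    execute(stop_server)
    execute(update_server)  # returns once every host has synced
    execute(start_server)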
Upvotes: 1
Reputation: 1830
You could do this with multithreading yourself, but Fabric already does it for you via the parallel decorator.
e.g.:
from time import sleep
from fabric.api import parallel, run, execute

@parallel
def parallel_ls():
    run("ls /")
    sleep(5)

if __name__ == '__main__':
    execute(parallel_ls, hosts=['127.0.0.1', 'localhost'])
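Run directly with "python fabfile.py", the __main__ guard passes the host list to execute() inline; under the fab tool the equivalent would be "fab parallel_ls -H 127.0.0.1,localhost". Either way, @parallel makes Fabric fork a worker process per host, so the ls (and the sleep) runs on both hosts at the same time rather than back to back.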
Upvotes: 0