Reputation: 165192
How do you configure fabric to connect to remote hosts using SSH keyfiles (for example, Amazon EC2 instances)?
Upvotes: 105
Views: 65186
Reputation: 2059
None of these answers worked for me on Python 3.7, Fabric 2.5.0 and Paramiko 2.7.1.
However, passing a PKey object via connect_kwargs, as described in the documentation, does work: http://docs.fabfile.org/en/2.5/concepts/authentication.html#private-key-objects
from fabric import Connection
from paramiko import RSAKey

ctx.connect_kwargs.pkey = RSAKey.from_private_key_file('path_to_your_aws_key')
with Connection(ctx.host, ctx.user, connect_kwargs=ctx.connect_kwargs) as conn:
    # etc.
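For reference, a self-contained sketch of the same idea (the host, user and key path below are placeholders) might look like this:
# Hypothetical standalone example (Fabric 2.x + Paramiko): load the key
# explicitly and hand it to the connection via connect_kwargs.
from fabric import Connection
from paramiko import RSAKey

key = RSAKey.from_private_key_file('path_to_your_aws_key')
with Connection('ec2-host.example.com', user='ubuntu',
                connect_kwargs={'pkey': key}) as conn:
    conn.run('uname -a')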
Upvotes: 2
Reputation: 5611
For fabric2 in fabfile use the following:
import os

from fabric import task, Connection

@task
def staging(ctx):
    ctx.name = 'staging'
    ctx.user = 'ubuntu'
    ctx.host = '192.1.1.1'
    ctx.connect_kwargs.key_filename = os.environ['ENV_VAR_POINTS_TO_PRIVATE_KEY_PATH']

@task
def do_something_remote(ctx):
    with Connection(ctx.host, ctx.user, connect_kwargs=ctx.connect_kwargs) as conn:
        conn.sudo('supervisorctl status')
and run it with:
fab staging do_something_remote
UPDATE:
For multiple hosts (this also works with a single host) you can use this:
import os

from fabric2 import task, SerialGroup

@task
def staging(ctx):
    conns = SerialGroup(
        '[email protected]',
        '[email protected]',
        connect_kwargs={
            'key_filename': os.environ['PRIVATE_KEY_TO_HOST']
        },
    )
    ctx.CONNS = conns
    ctx.APP_SERVICE_NAME = 'google'

@task
def stop(ctx):
    for conn in ctx.CONNS:
        conn.sudo('supervisorctl stop ' + ctx.APP_SERVICE_NAME)
and run it with fab or fab2:
fab staging stop
Upvotes: 21
Reputation: 1782
I had to do this today. My .py file was as simple as possible, like the one posted in @YuvalAdam's answer, but I still kept getting prompted for a password.
Looking at the Paramiko (the library Fabric uses for SSH) log, I found the line:
Incompatible ssh peer (no acceptable kex algorithm)
I updated Paramiko with:
sudo pip install paramiko --upgrade
And now it's working.
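If you need to inspect the Paramiko log yourself, it can be enabled with something like this (the log path is arbitrary):
# Enable Paramiko's debug log so kex/auth problems like the one above show up.
import logging
import paramiko

paramiko.util.log_to_file('/tmp/paramiko.log', level=logging.DEBUG)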
Upvotes: 7
Reputation: 3722
For me, the following didn't work:
env.user=["ubuntu"]
env.key_filename=['keyfile.pem']
env.hosts=["xxx-xx-xxx-xxx.ap-southeast-1.compute.amazonaws.com"]
or
fab command -i /path/to/key.pem [-H [user@]host[:port]]
However, the following did:
env.key_filename=['keyfile.pem']
env.hosts=["[email protected]"]
or
env.key_filename=['keyfile.pem']
env.host_string="[email protected]"
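For reference, a minimal Fabric 1.x fabfile built from the settings that worked above (the host and key path are placeholders) might look like:
# Hypothetical minimal fabfile using the working configuration above.
from fabric.api import env, run

env.key_filename = ['keyfile.pem']
env.hosts = ['[email protected]']

def uname():
    run('uname -a')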
Upvotes: 15
Reputation: 11
As stated above, Fabric will support .ssh/config file settings after a fashion, but using a .pem file for EC2 seems to be problematic. In other words, a properly set up .ssh/config file will work from the command line via 'ssh servername', yet fail with 'fab sometask' when env.hosts = ['servername'].
I overcame this by specifying env.key_filename = 'keyfile' in my fabfile.py, duplicating the IdentityFile entry already in my .ssh/config.
This could be an issue in either Fabric or Paramiko; in my case the versions were Fabric 1.5.3 and Paramiko 1.9.0.
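As an illustration of that workaround (the key path below is hypothetical), the fabfile simply repeats the key that ~/.ssh/config already points to:
# Hypothetical Fabric 1.x workaround: duplicate the IdentityFile path from
# ~/.ssh/config as env.key_filename so the .pem key is actually used.
from fabric.api import env

env.hosts = ['servername']                  # same alias as in ~/.ssh/config
env.key_filename = '~/.ssh/my-ec2-key.pem'  # same path as the IdentityFile entry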
Upvotes: 1
Reputation: 165192
Another cool feature, available as of Fabric 1.4: Fabric now supports SSH configs.
If you already have all the SSH connection parameters in your ~/.ssh/config file, Fabric will support it natively; all you need to do is add:
env.use_ssh_config = True
at the beginning of your fabfile.
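For example (the host alias below is hypothetical and assumed to be defined in ~/.ssh/config):
# Minimal sketch: let Fabric 1.4+ pull HostName, User and IdentityFile
# from ~/.ssh/config for the hypothetical 'myec2' alias.
from fabric.api import env, run

env.use_ssh_config = True
env.hosts = ['myec2']

def uname():
    run('uname -a')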
Upvotes: 67
Reputation: 6752
It's also worth mentioning that you can use command-line arguments for this:
fab command -i /path/to/key.pem [-H [user@]host[:port]]
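For example (the task name, key path and host are placeholders):
fab deploy -i ~/.ssh/my-ec2-key.pem -H [email protected]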
Upvotes: 71
Reputation: 165192
Finding a simple fabfile with a working example of SSH keyfile usage isn't easy for some reason. I wrote a blog post about it (with a matching gist).
Basically, the usage goes something like this:
from fabric.api import *

env.hosts = ['host.name.com']
env.user = 'user'
env.key_filename = '/path/to/keyfile.pem'

def local_uname():
    local('uname -a')

def remote_uname():
    run('uname -a')
The important part is setting env.key_filename, so that the underlying Paramiko connection can find the key when connecting.
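With the fabfile above, the remote task is then run with:
fab remote_uname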
Upvotes: 155