Roy

Reputation: 23

How to SSH between 2 Google Cloud Debian Instances

I have installed Ansible on one of my GCE Debian VM instances (instance 1). Now I want to connect to another GCE Debian VM instance (instance 2). I have generated the key pair on instance 1 and manually copied the .pub key into the authorized_keys file on instance 2. But when I try to SSH from 1 to 2, I get permission denied.

Is there any other way round? I am a little new to this and trying to learn. Is there a step-by-step guide available? Also, what is the exact IP address to SSH to: will it be the internal IP or the external IP assigned by GCE when the instance is started?
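Roughly what I did, with default key paths (the username is just a placeholder):

[user@Instance(1)]$ ssh-keygen -t rsa
[user@Instance(1)]$ cat ~/.ssh/id_rsa.pub
(then I pasted that output into ~/.ssh/authorized_keys on instance 2)
[user@Instance(1)]$ ssh user@<instance-2 IP>
(this is where I get permission denied)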

Upvotes: 2

Views: 1365

Answers (2)

Alioua

Reputation: 1776

It's quite simple: if you have two instances in Google Cloud Platform, the guest environment (including the gcloud command line) is installed automatically, and with it you can SSH to any instance inside your project.

Just run the following command from inside Instance(1) to reach Instance(2):

[user@Instance(1)]$ gcloud compute ssh Instance(2) --zone [zone]

That's it. If it's not working, let me know, and verify that your firewall rules allow internal traffic.
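For example, you can inspect the rules with gcloud. This is a sketch that assumes your instances are on the default network, where the default-allow-internal rule normally permits traffic between instances:

[user@Instance(1)]$ gcloud compute firewall-rules list

If nothing allows internal SSH, a rule like the following would open it (the rule name and source range here are illustrative, matching an auto-mode default network):

[user@Instance(1)]$ gcloud compute firewall-rules create allow-internal-ssh --network default --allow tcp:22 --source-ranges 10.128.0.0/9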

Upvotes: 0

Chris McCauley

Reputation: 26363

I'm an Ansible user too and I manage a set of compute engine servers. My scenario is pretty close to yours so hopefully this will work for you as well. To get this to work smoothly, you just need to realise that ssh public keys are metadata and can be used to tell GCE to create user accounts on instance creation.


SSH public keys are project-wide metadata

To get what you want, the SSH public key should be added to the Metadata section under Compute Engine. My keys look like this:

ssh-rsa AAAAB3<long key sequence shortened>Uxh bob

Every time I get GCE to create an instance, it creates /home/bob and puts the key into .ssh/authorized_keys with all of the correct permissions set. This means I can SSH into that server if I have the private key. In my scenario I keep the private key in only two places: LastPass and the .ssh directory on my work computer. While I don't recommend it, you could also copy that private key to the .ssh directory on each server that you want to SSH from, but I really recommend getting to grips with ssh-agent.
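If you prefer the command line to the console, the same project-wide keys can be managed with gcloud. This is a sketch: the file name is arbitrary, each line in project metadata is prefixed with the username, and setting ssh-keys replaces the existing value, so the file should contain every key you want to keep:

(contents of ssh-keys.txt, one key per line)
bob:ssh-rsa AAAAB3<long key sequence shortened>Uxh bob

[user@workstation]$ gcloud compute project-info add-metadata --metadata-from-file ssh-keys=ssh-keys.txt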


Getting it to work with Ansible

The core of this is to tell Ansible to skip host key checking and to connect as the user specified in the key (bob in this example). To do that you need to set some SSH options when calling ansible:

ansible-playbook --ssh-common-args='-o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no' -u bob

Now Ansible will connect to the servers mentioned in your playbook and use the local private key to negotiate the SSH connection, which should work since GCE set everything up when the VM was created. Also, since host key checking is off, you can rebuild the VM as often as you like.
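If you'd rather not pass those flags on every run, the same settings can live in an ansible.cfg next to your playbook. This is a sketch using standard Ansible configuration options, nothing GCE-specific:

[defaults]
remote_user = bob
host_key_checking = False

[ssh_connection]
ssh_args = -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no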


Saying it again

I really recommend that you run ansible from a small number of secure computers and not put your private key onto cloud servers. If you really need to ssh between servers, look into how ssh-agent passes identity around. A good place to start is this article.
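To sketch what that looks like with plain OpenSSH commands (the key path here is an assumption):

[user@workstation]$ eval "$(ssh-agent -s)"
[user@workstation]$ ssh-add ~/.ssh/id_rsa
[user@workstation]$ ssh -A bob@server1

The -A flag forwards your agent to server1, so from there you can SSH on to server2 using your local identity, without the private key ever leaving your workstation.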


Where did you say the metadata was?

I kind of glossed over that bit but here's an image to get you started.

[Image: the Metadata page under Compute Engine in the Cloud Console]

From there you just follow the options for adding a public key. Don't forget that this works because the third part of the key is the username that you want GCE and Ansible to use when running plays.

Upvotes: 2
