wherby

Reputation: 784

Google Cloud Platform: SSH to Google Cloud instance fails with "Permission denied (publickey)"

I came across the issue below when using ssh to log in to a Google Cloud instance:

$ ssh -i DD2 [email protected]
Permission denied (publickey).

After some testing, I found that the cause of the error is that the public key's comment is not consistent with the Google Cloud account:

For example :

scuio33@chef-server:~$ 

Here your account is scuio33, so your .pub file will be:

ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDBpNeFZyXXXehjPuGCkEjb/t
laNQt0fztORSCFFQIoKHkQzi7SNhp48kagyOHDNj6mY1LmVZB/sIj2oCa1AFupoFuBYc/XILP
rTX60fIlnBYkHl+6Kq/TX2hzKv scuio33

scuio33 must be exactly the same as your Google account name, or you will get "Permission denied (publickey)". Only Google Cloud has this restriction.
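A quick way to check which account name is on a key is to look at the last whitespace-separated field of the .pub file. A minimal sketch (the key below is a truncated stand-in, not a real key):

```shell
# The comment/account name is the last field of the public key line.
pubkey='ssh-rsa AAAAB3NzaC1yc2EAAAADAQAB scuio33'
account=$(printf '%s\n' "$pubkey" | awk '{print $NF}')
echo "$account"   # prints: scuio33
```

Compare that field against the account name you expect before blaming anything else.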

This is not really a "question", but a hint for anyone whose SSH connection to Google Cloud fails.

Upvotes: 31

Views: 90009

Answers (14)

bakursait

Reputation: 49

Well, I tried the answers above from @Antonin GAVREL and @Promise Preston but was still getting the same error.

I noticed that the file ~/.ssh/authorized_keys had not been created on the VM, although it should have been. So I combined all the suggestions and did the following:

Step 1: On my computer

  • Copied my public SSH key; it was in ~/.ssh/id_rsa.pub.

Step 2: On GCP

  • If the ~/.ssh directory does not exist yet, as in my case, create it manually or run ssh-keygen to create a key pair.
  • Create an ~/.ssh/authorized_keys file if it is not there.
  • Paste my local machine's public SSH key into it: ssh-rsa AAAAB3Nz...2M78= bakursait

I could SSH into the VM after that.
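The VM-side steps above can be sketched as shell commands. This demo runs against a scratch directory (VMHOME stands in for $HOME on the real VM, and the key is the truncated one from the answer):

```shell
# Create .ssh and authorized_keys with the permissions sshd requires,
# then append the workstation's public key.
VMHOME=$(mktemp -d)
mkdir -p "$VMHOME/.ssh"
chmod 700 "$VMHOME/.ssh"
printf '%s\n' 'ssh-rsa AAAAB3Nz...2M78= bakursait' >> "$VMHOME/.ssh/authorized_keys"
chmod 600 "$VMHOME/.ssh/authorized_keys"
```

The 700/600 permissions matter: sshd silently ignores authorized_keys files that are group- or world-writable.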

Upvotes: 0

Promise Preston

Reputation: 29078

I experienced this issue when trying to set up Kubernetes for the first time on Google Cloud Platform.

I was running into the error below each time I tried to SSH into my instance from my terminal:

[email protected]: Permission denied (publickey)

Here's how I solved it:

Open a terminal on your workstation and use the ssh-keygen command to generate a new key. Specify the -C flag to add a comment with your username.

ssh-keygen -t rsa -f ~/.ssh/[KEY_FILENAME] -C [USERNAME]

In my case it was:

ssh-keygen -t rsa -f ~/.ssh/kubernetes-trial -C promisepreston

Navigate into the .ssh directory:

cd ~/.ssh

Restrict access to your private key so that only you can read it and nobody can write to it.

chmod 400 [KEY_FILENAME]

In my case it was:

chmod 400 kubernetes-trial

Double click on kubernetes-trial.pub to open it OR print it on the console using the cat command:

cat kubernetes-trial.pub

The public SSH key should be of this format:

ssh-rsa [KEY_VALUE] [USERNAME]

OR

ssh-rsa [KEY_VALUE] google-ssh {"userName":"[USERNAME]","expireOn":"[EXPIRE_TIME]"}

In my case it was:

ssh-rsa AAAAB3MzaC1yc2EAAAADAQABAAABAQDdLjLb2b97m9NSK5Z8+j6U8awAwIx1Sbn9o4cEpYT2USYlFhJPRckgnmCQ+Eaim/sgL40V2v3Jwt6HVAY0L9bl84jmvox9QP4FOY7+LM02ZqfRB6LaEukM1tGdObVr+HBvhOwrxGCI06GFjnD3vVzW4jEsK75Y7MPzXd5YSpebGvU+7ZOuEcuSKp/R9dJcJn4kdXeaqor4gh8uTKQ43PGPTEvyoNlCWLkwSgy8khbo2BpoChLA7B53pVEhviMvVVIbmwpc6V2AIhRYY7ppR8oBzklLgh8CtTBPXtQRYiahLOIhds6ORf7wGNFI+A4sbBqwEL3J6av5fE1+zkUBhAHX promisepreston

Copy its contents and paste it into the SSH Keys section of your instance under Metadata (see "Adding or removing instance-level public SSH keys" in the docs).


In a local terminal, navigate to the directory where you have the private SSH key file, use the ssh command along with your private SSH key file, the username, and the external IP address of the instance to connect. For example:

ssh -i private-key username@external-ip-of-the-virtual-instance

In my case it was:

ssh -i kubernetes-trial [email protected]

After you connect, run commands on your instance using this terminal. When you finish, disconnect from the instance by running the exit command.


Upvotes: 63

x-yuri

Reputation: 18973

If your question is not a question, then my answer is not an answer, I guess. Anyway, it's not clear what should match what. The comment in ~/.ssh/authorized_keys on a CE instance and the user name you provide to the ssh command? That's totally not the case:

$ ssh [email protected] cat .ssh/authorized_keys
# Added by Google
ssh-rsa ... yuri@yuri

The reason I received "Permission denied (publickey)" was that I was using gcloud compute ssh INSTANCE in a Docker container running as root, like another person here. By default this command uses the local user name (root in my case). But:

By default, Compute Engine VMs built from public images and most common operating systems don't allow root login with a password over SSH.

https://cloud.google.com/compute/docs/connect/root-ssh

Launching a CE instance for the first time, I was confused: which user name should I use, and how do I add a public key? The answer, it appears, is to add the key to the project or instance metadata. The user you specify with the key is created automatically. Alternatively, you can use gcloud compute ssh INSTANCE or SSH-in-browser. But make sure you're not running as root, that root logins are allowed (PermitRootLogin), or that you specify a non-root username in the case of the gcloud command.
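If you do have to run gcloud as root (in a container, say), one workaround is to pass an explicit non-root username so the login name isn't derived from the local root account. A sketch (the user, instance, and zone names are placeholders):

```shell
# Log in as a named non-root user instead of the local (root) account:
gcloud compute ssh someuser@INSTANCE --zone=ZONE
```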

Upvotes: 0

Antonin GAVREL

Reputation: 11259

You have to make sure that the username you used to generate the key matches the one on your local machine.


Step-by-steps:

1/ Generate the key (no passphrase)

On your local machine: by default on Linux, $USER holds your username (echo $USER), so you don't even need to type it out.

ssh-keygen -t rsa -f ~/.ssh/my_google_cloud_key -C $USER

2/ Copy the key to Google Cloud metadata

cat /home/$USER/.ssh/my_google_cloud_key.pub

Select and copy it in https://console.cloud.google.com/compute/metadata/sshKeys (add key, then save)

3/ Connect to your VM

Get the external IP of your instance at https://console.cloud.google.com/compute/instances

EXTERNAL_IP={{input your external ip}}
ssh -i ~/.ssh/my_google_cloud_key $USER@$EXTERNAL_IP

Upvotes: 11

carbocation

Reputation: 9548

I was trying to do this as user jupyter. Prompted by @maximusX3's answer, which suggested making a change in the Metadata, I pulled up the SSH keys subpage of my GCP metadata.

Interestingly, when I loaded the page, I had duplicate SSH keys for the jupyter user, and the interface automatically prompted me to delete duplicate SSH keys. (You can do this through the GCP interface with one click.) After doing so, I was able to ssh in as jupyter using the usual gcloud compute ssh jupyter@machine command without any other local or remote changes.

Upvotes: 0

razimbres

Reputation: 5015

In my case, despite uploading the public key to the VM and configuring the SSH config file in Visual Studio Code with the private key, I was unable to connect, for the same reason.

For me, this is what solved it. Follow the steps to add the .pub file to the instance metadata, keep your private key in the SSH config file, then run:

gcloud compute ssh --zone "your-vm-zone" "your-instance"  --project "your_project"

This will generate an id_rsa file recognized by the Google Cloud VM. Then you can connect regularly over SSH; in my case, from Visual Studio Code.

Upvotes: 0

hbceylan

Reputation: 1282

I know there are similar answers, but I got lost in them, so briefly:

for the existing ssh key

1- Copy your public key in the following format, e.g.: ssh-rsa AAAAB3NzaC1yc2EAAAADAQABAAABAQDAu5kKQCPF... hbceylan

2- GCP > Edit VM > add SSH key > SAVE


3- SSH to the instance ssh -i /path-to-private-key/id_rsa [email protected]

for a new ssh-key

1- Create an ssh key, i.e:

ssh-keygen -t rsa -f ~/.ssh/x-poject-hbceylan -C hbceylan -b 2048

2- GCP > Edit VM > add SSH key > SAVE (same as for an existing key)

3- SSH to the instance

ssh -i /path-to-private-key/id_rsa [email protected]

Upvotes: 2

Salman

Reputation: 1040

Had the same error. Try adding --tunnel-through-iap:

gcloud compute ssh --zone "%zone%" "%instanceName%" --project "%projectName%" --tunnel-through-iap

Replace the following:

  • %zone% - the zone (location) of the compute engine instance
  • %instanceName% - the name of the instance (compute engine)
  • %projectName% - the current Google Cloud project name

Upvotes: 1

user3199143

Reputation: 1

For what it's worth, I had to add the SSH key under "Edit [name] instance" -> Security and access -> SSH Keys; I added the .pub key as mentioned above.

Upvotes: 0

Dasith Rathnasinghe

Reputation: 35

Go to the project-level metadata (Compute Engine -> Metadata) and ensure that you either have no enable-oslogin key or that it is set to FALSE. This worked from Windows PowerShell.
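The same change can be made from the command line. A sketch (this mutates your current project's metadata, so only run it if that is what you intend):

```shell
# Force OS Login off at the project level so metadata-based SSH keys work:
gcloud compute project-info add-metadata --metadata enable-oslogin=FALSE

# Or remove the key entirely:
gcloud compute project-info remove-metadata --keys=enable-oslogin
```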

Upvotes: 0

maximusX3

Reputation: 23

Promise Preston's solution works; just make sure you disable "ENABLE-OSLOGIN" or remove it from the metadata settings.

Upvotes: 2

wenzhan

Reputation: 21

Promise Preston's answer works for me. The problem was that the name did not match the Google account name.

Keep in mind that, if not specified, the username the Google SSH server recognizes is generated automatically from your email address BUT is not necessarily identical to it. In my case my email address is [email protected], but the username is actually wenzhan_main. Double-check that.

One more small thing to add: a command like ssh -i kubernetes-trial [email protected] will only work if you are already in the .ssh/ directory; otherwise, pass the full path to the key file.

Upvotes: 2

user13685912

Reputation: 21

Ran into the same error and resolved it by unchecking "Block project-wide SSH keys" on the "VM instance details" settings page. Apparently this setting blocks SSH-ing from your local machine to GCE instances.
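If you prefer the CLI, that checkbox corresponds to the block-project-ssh-keys metadata key on the instance. A sketch (the instance name is a placeholder):

```shell
# Allow project-wide SSH keys on this instance again:
gcloud compute instances add-metadata INSTANCE_NAME \
    --metadata block-project-ssh-keys=FALSE
```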

Upvotes: 1

Gery

Reputation: 41

Connecting with an SSH key to a Google Cloud Compute Engine instance is not limited to users of the project the instance belongs to. You can generate an SSH key, and as long as it is added to the instance and the user exists on the OS, you should be able to SSH in. You can connect with other usernames. Make sure that:

  1. You add the public key to the instance via the Google Cloud Console [1]

  2. Your username exists on the OS of your instance

  3. If you want to SSH as "root", change the configuration in your /etc/ssh/sshd_config file.
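The sshd_config change in point 3 can be sketched like this. The demo edits a scratch copy; on the real instance you would apply the same sed to /etc/ssh/sshd_config (with sudo) and then restart sshd:

```shell
# Switch PermitRootLogin to prohibit-password (key-based root login only;
# valid values include yes, prohibit-password, and no).
cfg=$(mktemp)
printf '#PermitRootLogin prohibit-password\n' > "$cfg"
sed -i 's/^#\?PermitRootLogin.*/PermitRootLogin prohibit-password/' "$cfg"
grep PermitRootLogin "$cfg"   # prints: PermitRootLogin prohibit-password
```

After editing the real file, restart the daemon, e.g. sudo systemctl restart sshd (the service may be named ssh on Debian/Ubuntu).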

Upvotes: 4
