Jay

Reputation: 51

No SSH tunnels currently open. Were the targets able to accept an ssh-key for user?

I created a Kubernetes cluster on GCP whose VM instances use Ubuntu images. I have deployed a Postgres StatefulSet, but when I try to check the logs of a pod I see the following:

kubectl logs pgset-1
Error from server: Get https://10.140.0.5:10250/containerLogs/default/pgset-1/pgset: No SSH tunnels currently open. Were the targets able to accept an ssh-key for user "gke-2bdc75f7d50bd7720226"?


kubectl get pods
NAME                                                             READY     STATUS    RESTARTS   AGE
maya-apiserver-5b48756968-95887                                  1/1       Running   0          1h
openebs-provisioner-7b59878f76-9w5z8                             1/1       Running   0          1h
pgset-0                                                          1/1       Running   0          12m
pgset-1                                                          1/1       Running   0          10m
pvc-8174e1f6-332f-11e8-85cd-42010af001b0-ctrl-fb5767469-jmjb2    2/2       Running   0          12m
pvc-8174e1f6-332f-11e8-85cd-42010af001b0-rep-8fd886589-tkvdq     1/1       Running   0          12m
pvc-8174e1f6-332f-11e8-85cd-42010af001b0-rep-8fd886589-wzr25     1/1       Running   0          12m
pvc-8174e1f6-332f-11e8-85cd-42010af001b0-rep-8fd886589-xvvfk     1/1       Running   0          12m
pvc-c34d6531-332f-11e8-85cd-42010af001b0-ctrl-6dd8948cbd-lz7dj   2/2       Running   0          10m
pvc-c34d6531-332f-11e8-85cd-42010af001b0-rep-64bdd45fc7-7fpnv    1/1       Running   0          10m
pvc-c34d6531-332f-11e8-85cd-42010af001b0-rep-64bdd45fc7-cf6w9    1/1       Running   0          10m
pvc-c34d6531-332f-11e8-85cd-42010af001b0-rep-64bdd45fc7-pg7bz    1/1       Running   0          10m



kubectl exec -it pgset-0 bash
Error from server: error dialing backend: No SSH tunnels currently open. Were the targets able to accept an ssh-key for user "gke-2bdc75f7d50bd7720226"?

What could the issue be here? What am I doing wrong?

Upvotes: 4

Views: 3152

Answers (2)

Lokesh kumar

Reputation: 137

It might be a problem with a firewall rule. In my case, though, I had previously been able to exec into pods, and one day I got the issue below:

"error: error upgrading connection: error dialing backend: No SSH tunnels currently open. Were the targets able to accept an ssh-key for user "gke-***"

I simply reconnected to the Kubernetes cluster, and it started working for me again.
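For reference, "reconnecting" here usually just means re-fetching the cluster credentials into your kubeconfig; a minimal sketch with placeholder names (YOUR_CLUSTER_NAME, YOUR_ZONE and YOUR_PROJECT are not from the question):

gcloud container clusters get-credentials YOUR_CLUSTER_NAME --zone YOUR_ZONE --project YOUR_PROJECT

After that, retry kubectl exec or kubectl logs to see whether the tunnels are back.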

Upvotes: 0

suren

Reputation: 8766

It could be that there is no firewall rule in your project allowing SSH traffic from the master to the nodes, or that the existing rule is wrong. This is what I would do:

  1. Run kubectl cluster-info and take note of the master IP address.
  2. Check your firewall rules for a rule named gke-YOUR_CLUSTER_NAME-NUMBER-ssh.
  3. If it exists, check that it makes sense (matches your master IP, etc.). If it does not exist, or it doesn't match, create one with the following characteristics (see the example gcloud commands after this list):
    • Target tags: copy and paste your nodes' network tag. It should be something similar to gke-YOUR_CLUSTER_NAME-NUMBER-node.
    • Type: Ingress
    • Source IP ranges: MASTER_IP/32
    • Priority: 1000
    • Protocols and ports: tcp:22
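
As a minimal sketch, assuming the default VPC network and using placeholders (replace YOUR_CLUSTER_NAME, NUMBER and MASTER_IP with your own values), you can inspect and create the rule with gcloud:

# list the SSH firewall rules GKE created for your clusters
gcloud compute firewall-rules list --filter="name~gke-.*-ssh"

# create (or recreate) the rule with the characteristics above
gcloud compute firewall-rules create gke-YOUR_CLUSTER_NAME-NUMBER-ssh \
    --network=default \
    --direction=INGRESS \
    --priority=1000 \
    --allow=tcp:22 \
    --source-ranges=MASTER_IP/32 \
    --target-tags=gke-YOUR_CLUSTER_NAME-NUMBER-node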

Hope it works

Upvotes: 5
