Reputation: 999
Is it possible to SSH into a pod?
E.g.: ssh pod_ip
I know we can do this with the kubectl command, but I need to SSH from my local Linux machine, which doesn't have kubectl.
Upvotes: 99
Views: 175006
Reputation: 9
You can try OpenPubkey to SSH without SSH keys; the OpenPubkey project documentation has more information.
Upvotes: 0
Reputation: 357
For anyone looking for a one-liner, this should do it. Given a namespace cars and the label selector app=hud, running this gets you a bash shell in the first matching pod:
kubectl exec -i -t $(kubectl get pod --namespace cars --selector='app=hud' --output jsonpath='{.items[0].metadata.name}') -n cars -- /bin/bash
where this part extracts the name of the first pod in the listing:
$(kubectl get pod --namespace cars --selector='app=hud' --output jsonpath='{.items[0].metadata.name}')
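The one-liner above can be wrapped in a small shell function so you don't retype the jsonpath each time. This is just a sketch; the function name kexec_first is made up, and the namespace and selector become parameters:

```shell
# Sketch: open a shell in the first pod matching a label selector.
# kexec_first <namespace> <selector>
kexec_first() {
  ns="$1"
  sel="$2"
  # Ask kubectl for the name of the first matching pod.
  pod=$(kubectl get pod --namespace "$ns" --selector="$sel" \
    --output jsonpath='{.items[0].metadata.name}') || return 1
  # Exec an interactive bash in that pod.
  kubectl exec -i -t "$pod" -n "$ns" -- /bin/bash
}
```

With the example from above, the call becomes kexec_first cars app=hud.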
Upvotes: 5
Reputation: 65
I use:
kubectl exec -it -n namespace-example deployments.apps/deployment-example -- bash
Upvotes: -2
Reputation: 6302
If you would like to log in to a particular container in the pod:
kubectl exec -it <Pod_Name> -c <Container_Name> -- /bin/bash
If you would like to log in to the default container (or if there is only one container in the pod), just use:
kubectl exec -it <Pod_Name> -- /bin/bash
PS: if /bin/bash is not working, try /bin/sh.
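The bash-or-sh fallback can be combined into one command. A minimal sketch (the function name kshell is made up; the pod name is passed as the argument):

```shell
# Sketch: try bash first, fall back to sh if the image has no bash.
# kshell <pod-name>
kshell() {
  kubectl exec -i -t "$1" -- /bin/bash ||
    kubectl exec -i -t "$1" -- /bin/sh
}
```

This relies on kubectl exec returning a non-zero status when /bin/bash is missing, which triggers the second command.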
Upvotes: 89
Reputation: 4799
First, ensure that openssh-server is installed and running in the pod. If it is not, you can use kubectl exec -it <pod-name> -n <namespace> -- bash
to access the pod.
If your pods are running Ubuntu, run apt-get install -y openssh-server
.
Second, pods run in a virtual IP subnet assigned by the cluster's network service, so they are reachable from any master or worker node in the cluster. You can SSH to the pod's IP from the host OS of any node.
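The flow from a cluster node can be sketched as follows. The function name ssh_pod is made up, and it assumes openssh-server is already running in the pod and that a root login exists there:

```shell
# Sketch: look up a pod's cluster-internal IP and ssh to it from a node.
# ssh_pod <pod-name> [namespace]
ssh_pod() {
  pod="$1"
  ns="${2:-default}"
  # Read the pod IP from the pod's status.
  ip=$(kubectl get pod "$pod" -n "$ns" -o jsonpath='{.status.podIP}') || return 1
  ssh root@"$ip"
}
```

Note that this only works from a machine with routes into the pod network (typically a cluster node), which is why it does not help from an arbitrary external machine.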
Upvotes: 85