Reputation: 4393
I run a Kubernetes cluster on a 'bare metal' Ubuntu machine, as described here: http://kubernetes.io/docs/getting-started-guides/ubuntu/. After I create a LoadBalancer service, I can see which IP address it runs on:
kubectl describe services sonar
Name: sonar
IP: 10.0.0.170
Port: <unset> 9000/TCP
Endpoints: 172.17.0.2:9000
. . .
Then I expose this to the world with nginx running outside of the Kubernetes cluster. Great, but on the next service deployment the IP changes. How can I deal with this? Fix the IP, use environment variables, or some other way?
Upvotes: 0
Views: 2045
Reputation: 126
Disclaimer: I work for Stackpoint. After studying the different options we decided to use ingress controllers for our product, so my answer is biased towards Ingresses.
With Ingress + an Ingress Controller you can balance external loads to the pod endpoints. While Services are resources whose main purpose is to track pods and create routes (among other things), an Ingress is a much better way of defining balancing rules.
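As a rough sketch (the hostname and the Ingress name are placeholders, and I'm assuming the sonar service from the question), an Ingress for this could look something like:

apiVersion: extensions/v1beta1
kind: Ingress
metadata:
  name: sonar-ingress
spec:
  rules:
  - host: sonar.example.com        # hypothetical external hostname
    http:
      paths:
      - path: /
        backend:
          serviceName: sonar       # the existing service from the question
          servicePort: 9000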
The big disadvantage of Ingress is that you need an ingress controller that listens for Ingress resources, resolves endpoints, communicates config changes to the balancer and reloads it if necessary. Since we are in control of what the Ingress tells the balancer, we can configure keepalives, sticky sessions, health checks, etc.
Using services you are not in full control of all those parameters.
There is an nginx example at kubernetes/contrib that should match most scenarios. At Stackpoint we are using our own HAProxy ingress controller and are quite happy with the results (and will offer Ingress management from our UI shortly).
The Kubernetes Ingress documentation page contains more info and, at the bottom, a section with links to the alternatives.
Upvotes: 2
Reputation: 297
Without having seen your service definition, it sounds to me like you want a NodePort type of service rather than a LoadBalancer. With a NodePort service you would simply point NGINX to the IP address of the Ubuntu machine and the port specified in the service definition. As long as the address of the Ubuntu machine is stable, you should be fine.
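A sketch of such a service definition, assuming the sonar pods carry an app: sonar label (the label and the pinned nodePort below are illustrative):

apiVersion: v1
kind: Service
metadata:
  name: sonar
spec:
  type: NodePort
  selector:
    app: sonar          # assumed pod label; adjust to match your deployment
  ports:
  - port: 9000          # cluster-internal port, as in the question
    targetPort: 9000    # container port
    nodePort: 30900     # optional: pin the port; must fall in the 30000-32767 range

NGINX outside the cluster would then proxy to <node-ip>:30900, which stays stable across redeployments of the service.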
If you run Kubernetes on multiple machines, you simply add the IP addresses of all the nodes to your NGINX configuration and let it do the load balancing.
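For example, a minimal NGINX config on the machine outside the cluster might look like this (the node IPs, the hostname and the 30900 nodePort are placeholders for illustration):

upstream sonar_nodes {
    server 192.168.1.10:30900;   # Kubernetes node 1 (hypothetical address)
    server 192.168.1.11:30900;   # Kubernetes node 2 (hypothetical address)
}

server {
    listen 80;
    server_name sonar.example.com;   # hypothetical hostname

    location / {
        proxy_pass http://sonar_nodes;
    }
}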
More information about the different service types is available here: http://kubernetes.io/docs/user-guide/services/#publishing-services---service-types
Upvotes: 3