Reputation: 1284
I am a Docker swarm noob. I have two Python Flask apps that are being migrated to run as Docker services, and one needs to call the other. I am using an nginx reverse proxy to manage external connections to the services.
The nginx locations settings are:
location /alpha/ {
    proxy_pass https://alpha-app:5000/;
}
location /beta/ {
    proxy_pass https://beta-app:5001/;
}
When running in docker swarm, where "demo" is the stack name:
$ sudo docker service ls
NAME         PORTS
demo_alpha   *:3002->5000/tcp
demo_beta    *:3001->5001/tcp
demo_nginx   *:443->443/tcp
I can access the services externally at:
https://my-host/alpha/some_endpoint
https://my-host/beta/some_endpoint
Now I need to have alpha call a service in beta. If I run the apps in regular docker containers, then the following call from alpha to beta works:
url = "https://my-host/beta/some_endpoint"
requests.get(url, cert=cert, verify=verify)
Note that when running in Docker swarm, the apps run on different hosts but share the same network. I can't get the app-to-app connection to work when the apps run as services in a Docker swarm. I can still call each app service from outside the swarm:
https://my-host/alpha/some_endpoint -> works
https://my-host/beta/some_endpoint -> works
I cannot get alpha to consume a service from beta. I have tried just using the service name:
url = "https://beta-app/some_endpoint" -> connection refused
url = "https://beta-app:5001/some_endpoint" -> hostname doesn't match
url = "https://my-host/beta/some_endpoint" -> name or service not known
requests.get() always fails
What is the correct url to use for one docker swarm service to call another? Do I need to look up the service's internal IP?
Upvotes: 4
Views: 3709
Reputation: 166
I do not yet have enough reputation to post this as a comment.
Ram Idavalapati's answer is incorrect in the sense that it works only if the containers for the service are all on the same node OR on the same subnet. It is not possible to reach a port of another service residing on a different node that is not on the same subnet (for example, different availability zones or different clouds) by using the service name.
I have created issues on Stack Overflow and GitHub for this. I'm writing it here so that anyone who comes looking knows they are not alone!
Upvotes: 2
Reputation: 714
Using the service name as the host will enable communication between two Docker services/containers running in a Docker swarm on the same overlay network.
Update: see https://docs.docker.com/network/overlay/#container-discovery
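Applied to the question's setup, that means calling beta by its service name and its container port (5001), not the published port (3001). A minimal sketch — the names beta-app/demo_beta come from the question, and the service_url helper is an assumption for illustration:

```python
# Sketch of an in-swarm call, assuming alpha and beta share one overlay
# network and "beta-app" is the compose service key (swarm DNS also
# answers to the stack-qualified name, e.g. "demo_beta").
def service_url(service, port, endpoint, scheme="https"):
    # Use the container port: swarm DNS resolves the service name to a
    # virtual IP on the overlay network, where only the container port
    # is reachable (published ports apply at the swarm edge).
    return "{}://{}:{}/{}".format(scheme, service, port, endpoint.lstrip("/"))

print(service_url("beta-app", 5001, "some_endpoint"))
# -> https://beta-app:5001/some_endpoint
```

If beta serves TLS with a certificate issued for a different hostname, this URL would still trip the "hostname doesn't match" error from the question; the certificate would need the service name as a SAN, or the internal hop would need to use plain HTTP.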
Example stack file: test.yml
version: "3.4"
services:
  # This is the service name which is used in master as the host.
  # ex: http://shard:<port>
  shard:
    image: ramidavalapati/shard:0.1
    deploy:
      restart_policy:
        condition: on-failure
    networks:
      - abc
  master:
    image: ramidavalapati/master:0.1
    deploy:
      restart_policy:
        condition: on-failure
    ports:
      - 5000:80
    networks:
      - abc
networks:
  abc:
    driver: overlay
deploy: sudo docker stack deploy -c test.yml test
API call: curl http://localhost:5000
This call will go to the master service, and the master service will then call the shard service.
Master (app.py):
import urllib
from flask import Flask

app = Flask(__name__)

@app.route("/")
def hello():
    # "shard" resolves via swarm DNS on the shared overlay network.
    r = urllib.urlopen("http://shard:80")
    return r.read()

if __name__ == '__main__':
    app.run(host='0.0.0.0', port=80)
Shard (app.py):
from flask import Flask

app = Flask(__name__)

@app.route("/")
def hello():
    return "Hello World"

if __name__ == "__main__":
    app.run(host='0.0.0.0', port=80)
Dockerfile for both master and shard:
FROM python:2.7-slim
RUN pip install Flask
ADD . .
CMD ["python", "app.py"]
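If it's unclear whether swarm DNS is resolving the service name at all, a quick check from inside one of the containers (e.g. via docker exec) is to look the name up directly. This is a small sketch for debugging, not part of the original deployment:

```python
import socket

def resolve_service(name):
    # Swarm's embedded DNS maps a service name (e.g. "shard") to a
    # virtual IP on the overlay network; outside that network the
    # lookup fails with socket.gaierror.
    try:
        return socket.gethostbyname(name)
    except socket.gaierror:
        return None

# Inside a container attached to the "abc" overlay network,
# resolve_service("shard") returns a virtual IP (e.g. 10.0.x.x);
# anywhere else it returns None.
print(resolve_service("shard"))
```

If this returns None inside the container, the service is not on the same overlay network (or the name is wrong), and the HTTP call can never succeed.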
Upvotes: 2