Reputation: 5150
I have a tiny socket server in a Docker container. The server looks like this:
var app = require('express')();
var server = require('http').Server(app);
// socket.io v2 "origins" option: only accept connections from localhost on any port
var io = require('socket.io')(server, {origins: 'localhost:*'});

io.on('connection', function (socket) {
    console.log('Connected');
});

const PORT = 8081;
const HOST = '0.0.0.0'; // bind to all interfaces so the mapped container port is reachable
server.listen(PORT, HOST);
and the Dockerfile is:
FROM keymetrics/pm2-docker-alpine:latest
WORKDIR /root
RUN apk update && \
    apk upgrade && \
    apk add git
ENV HOME /root
COPY socket.js ./
COPY package.json ./
RUN npm install
COPY pm2.json ./
EXPOSE 8081
CMD [ "pm2-docker", "start", "pm2.json" ]
pm2.json looks like this:
{
  "apps": [{
    "name": "socket-server",
    "script": "socket.js",
    "exec_mode": "cluster",
    "instances": 2,
    "env": {
      "production": true
    }
  }]
}
and package.json:
{
  "name": "socket-server",
  "version": "1.0.0",
  "description": "",
  "main": "socket.js",
  "scripts": {
    "test": "echo \"Error: no test specified\" && exit 1"
  },
  "author": "",
  "license": "ISC",
  "dependencies": {
    "express": "^4.15.3",
    "socket.io": "^2.0.3"
  }
}
It all runs just fine with
docker run -d -p 8081:8081 socket-server
until I try to connect to it from a website running in another container. The website connects like this:
<script src="socket.io.js"></script>
<script>
    var socket = io.connect('http://localhost:8081');
    socket.on('connect', function(data) {
        console.log('Connected Client');
    });
</script>
In the browser console I can see that polling works just fine:
Request URL: http://localhost:8081/socket.io/?EIO=3&transport=polling&t=LthQCgI&sid=93sOyTiSOe5RVOdEAAAL
Request Method: POST
Status Code: 200 OK
but the WebSocket upgrade fails:
Request URL: ws://localhost:8081/socket.io/?EIO=3&transport=websocket&sid=93sOyTiSOe5RVOdEAAAL
Request Method: GET
Status Code: 400 Bad Request
If I run the socket server outside the Docker container, it's fine and the socket connects.
I have also tried getting the IP of the container running the socket server and using that in the connection script, but configured like that even the polling doesn't work.
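For reference, I looked the container's IP up roughly like this (a sketch; <container-id> stands in for my running container):

# prints the container's IP on the default bridge network
docker inspect -f '{{ .NetworkSettings.IPAddress }}' <container-id>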
I really need this inside a Docker container.
Any help is most appreciated.
Upvotes: 10
Views: 16956
Reputation: 482
Although this is an old question, I figured I would elaborate a bit for anyone else wondering how to connect containers, since the previous answer was a little slim.
Using swarm in this case would be overkill, especially for something like running the containers locally in a way that lets them talk to each other. Instead, you simply want to put the containers on the same Docker network.
version: '3.5'

services:
  app:
    build:
      context: .
      dockerfile: Dockerfile
    command: app
    ports:
      - "4000:4000"
    volumes:
      - .:/app
    networks:
      - app-network

  pgsql:
    image: postgres:latest
    ports:
      - "5432:5432"
    volumes:
      - postgres_data:/var/lib/postgresql/data
    networks:
      - app-network

networks:
  app-network:
    driver: bridge

volumes:
  postgres_data:
    driver: local
In the docker-compose.yml example above, you can see that I am defining a network via:
networks:
  app-network:
    driver: bridge
Then I assign both the app and pgsql containers to that network via:
networks:
  - app-network
This allows the containers to reach one another by service name. So in my code I can now use pgsql:5432 to communicate with the Postgres service, and the app container is likewise reachable from the pgsql container at app:4000.
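To make that concrete, here is a minimal sketch of server-side code in the app container talking to the pgsql service by name (the pg package and the credentials are assumptions for illustration):

// sketch: assumes "npm install pg" and placeholder credentials
const { Client } = require('pg');

const client = new Client({
    host: 'pgsql',        // the compose service name, resolved via app-network
    port: 5432,
    user: 'postgres',     // assumption: placeholder credentials
    password: 'postgres',
    database: 'postgres'
});

client.connect()
    .then(() => console.log('Connected to postgres via the compose network'))
    .catch(err => console.error('Connection failed:', err));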
While this can get much more complex than the above example, I figured I'd leave a working docker-compose.yml example here. You can find out more about Docker networks at https://docs.docker.com/compose/networking/
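The same wiring also works without Compose, using plain Docker commands (a sketch; the image name my-app-image and the password are placeholders):

# create a user-defined bridge network, then attach both containers to it
docker network create app-network
docker run -d --name pgsql --network app-network -e POSTGRES_PASSWORD=postgres postgres:latest
docker run -d --name app --network app-network -p 4000:4000 my-app-image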
Upvotes: 7
Reputation: 81
Maybe you should try creating a Docker swarm and letting the containers join the same network:
....
version: '3.5'
services:
  myserver:
    image: 'mydocker-image'
    networks:
      - mynetwork
....
and then access the server like this: http://myserver:8081
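For completeness, a rough sketch of how you would create the swarm and deploy the file as a stack (the stack name mystack is an assumption):

# initialize swarm mode, then deploy the compose file as a stack
docker swarm init
docker stack deploy -c docker-compose.yml mystack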
Upvotes: 0