eugene

Reputation: 41665

Docker, java.net.UnknownHostException: docker-desktop: docker-desktop: Name does not resolve

I am running Docker containers successfully on Ubuntu machines.

However, I'm having trouble running the same containers on Mac machines. I've tried on two Macs, and the error messages are the same.

    spark-worker_1  | java.net.UnknownHostException: docker-desktop: docker-desktop: Name does not resolve
    spark-worker_1  |       at java.net.InetAddress.getLocalHost(InetAddress.java:1506)
    spark-worker_1  |       at org.apache.spark.util.Utils$.findLocalInetAddress(Utils.scala:946)
    spark-worker_1  |       at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress$lzycompute(Utils.scala:939)
    spark-worker_1  |       at org.apache.spark.util.Utils$.org$apache$spark$util$Utils$$localIpAddress(Utils.scala:939)
    spark-worker_1  |       at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:1003)
    spark-worker_1  |       at org.apache.spark.util.Utils$$anonfun$localHostName$1.apply(Utils.scala:1003)
    spark-worker_1  |       at scala.Option.getOrElse(Option.scala:121)
    spark-worker_1  |       at org.apache.spark.util.Utils$.localHostName(Utils.scala:1003)
    spark-worker_1  |       at org.apache.spark.deploy.worker.WorkerArguments.<init>(WorkerArguments.scala:31)
    spark-worker_1  |       at org.apache.spark.deploy.worker.Worker$.main(Worker.scala:778)
    spark-worker_1  |       at org.apache.spark.deploy.worker.Worker.main(Worker.scala)
    spark-worker_1  | Caused by: java.net.UnknownHostException: docker-desktop: Name does not resolve
    spark-worker_1  |       at java.net.Inet6AddressImpl.lookupAllHostAddr(Native Method)
    spark-worker_1  |       at java.net.InetAddress$2.lookupAllHostAddr(InetAddress.java:929)
    spark-worker_1  |       at java.net.InetAddress.getAddressesFromNameService(InetAddress.java:1324)
    spark-worker_1  |       at java.net.InetAddress.getLocalHost(InetAddress.java:1501)
    spark-worker_1  |       ... 10 more
    docker_spark-worker_1 exited with code 51

Here is my docker-compose.yml file:

    services:

      spark-master:
        build:
          context: ../../
          dockerfile: ./danalysis/docker/spark/Dockerfile
        image: spark:latest
        container_name: spark-master
        hostname: node-master
        ports:
          - "7077:7077"
        network_mode: host
        environment:
          - "SPARK_LOCAL_IP=node-master"
          - "SPARK_MASTER_PORT=7077"
          - "SPARK_MASTER_WEBUI_PORT=10080"
        command: "/start-master.sh"
        dns:
          - 192.168.1.1  # IP needed to reach a database instance external to the server the container runs on

      spark-worker:
        image: spark:latest
        environment:
          - "SPARK_MASTER=spark://node-master:7077"
          - "SPARK_WORKER_WEBUI_PORT=8080"
        command: "/start-worker.sh"
        ports:
          - 8080
        network_mode: host
        depends_on:
          - spark-master
        dns:
          - 192.168.1.1  # IP needed to reach a database instance external to the server the container runs on
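For context, `docker-desktop` appears to be the hostname of the Docker Desktop VM that `network_mode: host` attaches to on macOS, and that name is not resolvable from inside the container, which is what `InetAddress.getLocalHost()` trips over. Spark only falls back to `getLocalHost()` when no local address is configured, and the worker (unlike the master) sets no `SPARK_LOCAL_IP`. A minimal, untested sketch of keeping host networking while overriding the lookup (the loopback value is an assumption to adapt):

    spark-worker:
      image: spark:latest
      network_mode: host
      environment:
        - "SPARK_MASTER=spark://node-master:7077"
        - "SPARK_LOCAL_IP=127.0.0.1"   # assumption: bypasses the hostname lookup that fails on docker-desktop
      command: "/start-worker.sh"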

**Edit**

I found a way to make it work by commenting a few lines out (shown below). So why are those two settings a problem?

And even though the container now runs fine and connects to the spark-master, it registers with an internal IP. As you can see, 172.18.0.2 is not an address we normally see on our network; I think the IP comes from the Docker container, not the host.

[screenshot: worker registered with the master at IP 172.18.0.2]

    # network_mode: host
    depends_on:
      - spark-master
    # dns:
    #   - 192.168.1.1  # IP needed to reach a database instance external to the server the container runs on
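The 172.18.0.2 address comes from the default bridge network that Compose creates once `network_mode: host` is removed; containers on that network reach each other by service name, which is why the worker can register at all. A rough sketch, assuming you want the worker to advertise a stable name rather than the bridge IP (`SPARK_LOCAL_HOSTNAME` is assumed here as the knob, alongside the `SPARK_LOCAL_IP` you already set on the master):

    spark-worker:
      image: spark:latest
      hostname: spark-worker                      # resolvable by other containers on the compose network
      environment:
        - "SPARK_MASTER=spark://node-master:7077"
        - "SPARK_LOCAL_HOSTNAME=spark-worker"     # assumption: advertise this name instead of 172.18.0.2
      depends_on:
        - spark-master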

Upvotes: 1

Views: 5251

Answers (1)

shanmuga

Reputation: 4499

Try changing the Docker network type to macvlan in the docker-compose file. This should attach the container directly to your network (making it appear like another physical machine) with an IP in the same range as the host. You can also try adding an entry for it to your /etc/hosts.
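A rough sketch of what that could look like in the compose file (the parent interface, subnet, and addresses are assumptions you would need to adapt to your LAN):

    networks:
      lan:
        driver: macvlan
        driver_opts:
          parent: eth0                   # assumption: the host interface attached to your LAN
        ipam:
          config:
            - subnet: 192.168.1.0/24
              gateway: 192.168.1.1

    services:
      spark-worker:
        image: spark:latest
        networks:
          lan:
            ipv4_address: 192.168.1.50   # assumption: a free address on your LAN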

The proper way to run containers on different machines would be to use an overlay network to connect the Docker daemons on those machines, or to create a Docker swarm cluster using the laptops.
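A minimal sketch of the overlay/swarm route (the stack name, join token, and compose version are placeholders/assumptions):

    # on one machine:      docker swarm init
    # on the other(s):     docker swarm join --token <token> <manager-ip>:2377
    # then deploy with:    docker stack deploy -c docker-compose.yml spark

    version: "3.7"

    networks:
      spark-net:
        driver: overlay

    services:
      spark-master:
        image: spark:latest
        hostname: node-master
        networks:
          - spark-net

      spark-worker:
        image: spark:latest
        environment:
          - "SPARK_MASTER=spark://node-master:7077"
        networks:
          - spark-net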

https://docs.docker.com/network/

Upvotes: 1
