user3002486

Reputation: 431

Docker networking: cannot connect to local IP from container

Bit of a newb when it comes to networking...

enp0s8:
      dhcp4: no
      renderer: networkd
      addresses: [192.168.0.50/24]
      gateway4: 192.168.1.1
      nameservers:
        addresses: [8.8.8.8, 8.8.4.4]

This pretty much narrows it down to a networking issue. I have tried running the container with --net=host and it still doesn't work. I'm confused as to why my Docker container cannot connect to the VM's IP address when the script runs fine locally on my MacBook.
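One way to rule plain networking in or out (this is a sketch I put together, not something from my original debugging) is a bare TCP connect test run from inside the container, pointed at the broker address my script uses:

```python
import socket

def can_connect(host, port, timeout=5):
    """Return True if a plain TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:  # refused, unreachable, or timed out
        return False

# From inside the container:
#   can_connect('192.168.0.51', 9093)  -> True only if the broker port is reachable
```

If this returns False, the problem is below Kafka entirely (routing/firewall); if it returns True, the broker is reachable but likely advertising an address the client cannot use.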

Edit:

To give further clarifications on the Python script... The entire script is here:

from kafka.consumer import KafkaConsumer

def create_kafka_consumer(topic, group_id, brokers):
    return KafkaConsumer(topic,
                         bootstrap_servers=brokers,
                         auto_offset_reset='earliest',
                         enable_auto_commit=True,
                         group_id=group_id,
                         value_deserializer=lambda x: x.decode('utf-8'))

GROUP_ID = 'test'

def main():
    topic = 'airbnb'
    # Address of the Kafka worker node running in the VirtualBox VM
    c = create_kafka_consumer(topic, 'tester', '192.168.0.51:9093')
    for msg in c:
        print(msg)

if __name__ == '__main__':
    main()

The IP address and port are for the Kafka WORKER node (which is running in the VirtualBox VM), not the master node (whose IP is listed above). I set the IP in the VM, and the port was set manually in the docker-compose file.
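For reference, the listener block I mean is roughly like the sketch below. This is illustrative only; the service and environment variable names follow the wurstmeister/kafka image conventions and may differ from my actual file. The key point is that the advertised listener must be an address the *client* can reach (here the VM's IP), not a container-internal hostname:

```yaml
# Sketch only: names assume wurstmeister/kafka-style conventions.
kafka:
  image: wurstmeister/kafka
  ports:
    - "9093:9093"
  environment:
    KAFKA_LISTENERS: PLAINTEXT://0.0.0.0:9093
    # Must be reachable from the client's network, e.g. the VM's IP:
    KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://192.168.0.51:9093
    KAFKA_ZOOKEEPER_CONNECT: zookeeper:2181
```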

Successful output example (just prints all the kafka logs):

ConsumerRecord(topic='airbnb', partition=0, offset=169, timestamp=1598650807052, timestamp_type=0, key=None, value='adivinen quien cogio un airbnb para salir de la rutina y tiene que trabajar el weekend? ', headers=[], checksum=None, serialized_key_size=-1, serialized_value_size=111, serialized_header_size=-1)
ConsumerRecord(topic='airbnb', partition=0, offset=170, timestamp=1598650821359, timestamp_type=0, key=None, value='This x100%', headers=[], checksum=None, serialized_key_size=-1, serialized_value_size=10, serialized_header_size=-1)

Unsuccessful output:

Traceback (most recent call last):
  File "tmp/consumer-kafka.py", line 21, in <module>
    main()
  File "tmp/consumer-kafka.py", line 16, in main
    c = create_kafka_consumer(topic, 'tester', '192.168.0.51:9093')
  File "tmp/consumer-kafka.py", line 11, in create_kafka_consumer
    value_deserializer=lambda x: x.decode('utf-8'))
  File "/usr/local/lib/python3.6/site-packages/kafka/consumer/group.py", line 355, in __init__
    self._client = KafkaClient(metrics=self._metrics, **self.config)
  File "/usr/local/lib/python3.6/site-packages/kafka/client_async.py", line 242, in __init__
    self.config['api_version'] = self.check_version(timeout=check_timeout)
  File "/usr/local/lib/python3.6/site-packages/kafka/client_async.py", line 925, in check_version
    raise Errors.NoBrokersAvailable()
kafka.errors.NoBrokersAvailable: NoBrokersAvailable

Output of netstat -ntl on the Kafka worker: [screenshot of the listening ports attached]

Upvotes: 1

Views: 866

Answers (1)

vgeorge

Reputation: 116

Have you tried the --link option when running the container, to let the two containers communicate? It adds an entry to the container's /etc/hosts file (or you can add the entry manually and check).

I think this may help you: https://docs.docker.com/network/links/

youtube: https://www.youtube.com/results?search_query=docker+linking+multiple+containers

If neither of those fits your setup, look into how DNS works in Docker; this might be DNS-related :-)
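As a quick way to test the DNS angle from inside a container (a minimal sketch of my suggestion, with placeholder hostnames):

```python
import socket

def resolve(name):
    """Return the IPv4 address for name, or None if it does not resolve."""
    try:
        return socket.gethostbyname(name)
    except socket.gaierror:
        return None

print(resolve('localhost'))             # typically 127.0.0.1
print(resolve('no-such-host.invalid'))  # None: .invalid names never resolve
```

Run it inside the container with the hostname your client actually uses; if the name doesn't resolve there but the raw IP connects fine, the problem is DNS rather than routing.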

Upvotes: 1
