Reputation: 410
I'm trying to figure out why my docker container running Django times out when trying to communicate with a docker container running OpenSearch.
I'm testing out DigitalOcean for containerising my web services. The droplet has 2 vCPUs and 4GB RAM.
The Django app container is using gunicorn which is accessed via an nginx reverse proxy.
The OpenSearch container is launched via the developer docker compose file provided on their website.
If I launch all the containers, all of Django's pages that don't require any interaction with the django-opensearch-dsl package load and operate fine. I can also launch a Django shell and query the database, etc.
If I try to run an OpenSearch-related command, though, such as trying to create an index, it times out. For example, running
docker exec -it myapplication-webapp python manage.py opensearch index create
results in
Traceback (most recent call last):
File "/usr/local/lib/python3.10/site-packages/urllib3/connection.py", line 174, in _new_conn
conn = connection.create_connection(
File "/usr/local/lib/python3.10/site-packages/urllib3/util/connection.py", line 95, in create_connection
raise err
File "/usr/local/lib/python3.10/site-packages/urllib3/util/connection.py", line 85, in create_connection
sock.connect(sa)
TimeoutError: timed out
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.10/site-packages/urllib3/connectionpool.py", line 703, in urlopen
httplib_response = self._make_request(
File "/usr/local/lib/python3.10/site-packages/urllib3/connectionpool.py", line 398, in _make_request
conn.request(method, url, **httplib_request_kw)
File "/usr/local/lib/python3.10/site-packages/urllib3/connection.py", line 244, in request
super(HTTPConnection, self).request(method, url, body=body, headers=headers)
File "/usr/local/lib/python3.10/http/client.py", line 1282, in request
self._send_request(method, url, body, headers, encode_chunked)
File "/usr/local/lib/python3.10/http/client.py", line 1328, in _send_request
self.endheaders(body, encode_chunked=encode_chunked)
File "/usr/local/lib/python3.10/http/client.py", line 1277, in endheaders
self._send_output(message_body, encode_chunked=encode_chunked)
File "/usr/local/lib/python3.10/http/client.py", line 1037, in _send_output
self.send(msg)
File "/usr/local/lib/python3.10/http/client.py", line 975, in send
self.connect()
File "/usr/local/lib/python3.10/site-packages/urllib3/connection.py", line 205, in connect
conn = self._new_conn()
File "/usr/local/lib/python3.10/site-packages/urllib3/connection.py", line 179, in _new_conn
raise ConnectTimeoutError(
urllib3.exceptions.ConnectTimeoutError: (<urllib3.connection.HTTPConnection object at 0x7f64e5d12f20>, 'Connection to 123.456.78.9 timed out. (connect timeout=120)')
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.10/site-packages/requests/adapters.py", line 489, in send
resp = conn.urlopen(
File "/usr/local/lib/python3.10/site-packages/urllib3/connectionpool.py", line 787, in urlopen
retries = retries.increment(
File "/usr/local/lib/python3.10/site-packages/urllib3/util/retry.py", line 592, in increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='123.456.78.9', port=9200): Max retries exceeded with url: /collections (Caused by ConnectTimeoutError(<urllib3.connection.HTTPConnection object at 0x7f64e5d12f20>, 'Connection to 123.456.78.9 timed out. (connect timeout=120)'))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/usr/local/lib/python3.10/site-packages/opensearchpy/connection/http_requests.py", line 179, in perform_request
response = self.session.send(prepared_request, **send_kwargs)
File "/usr/local/lib/python3.10/site-packages/requests/sessions.py", line 701, in send
r = adapter.send(request, **kwargs)
File "/usr/local/lib/python3.10/site-packages/requests/adapters.py", line 553, in send
raise ConnectTimeout(e, request=request)
requests.exceptions.ConnectTimeout: HTTPConnectionPool(host='123.456.78.9', port=9200): Max retries exceeded with url: /collections (Caused by ConnectTimeoutError(<urllib3.connection.HTTPConnection object at 0x7f64e5d12f20>, 'Connection to 123.456.78.9 timed out. (connect timeout=120)'))
During handling of the above exception, another exception occurred:
Traceback (most recent call last):
File "/opt/app/myapplication/manage.py", line 10, in <module>
execute_from_command_line(sys.argv)
File "/usr/local/lib/python3.10/site-packages/django/core/management/__init__.py", line 419, in execute_from_command_line
utility.execute()
File "/usr/local/lib/python3.10/site-packages/django/core/management/__init__.py", line 413, in execute
self.fetch_command(subcommand).run_from_argv(self.argv)
File "/usr/local/lib/python3.10/site-packages/django/core/management/base.py", line 354, in run_from_argv
self.execute(*args, **cmd_options)
File "/usr/local/lib/python3.10/site-packages/django/core/management/base.py", line 398, in execute
output = self.handle(*args, **options)
File "/usr/local/lib/python3.10/site-packages/django_opensearch_dsl/management/commands/opensearch.py", line 356, in handle
options["func"](**options)
File "/usr/local/lib/python3.10/site-packages/django_opensearch_dsl/management/commands/opensearch.py", line 112, in _manage_index
index.create()
File "/usr/local/lib/python3.10/site-packages/opensearch_dsl/index.py", line 289, in create
return self._get_connection(using).indices.create(
File "/usr/local/lib/python3.10/site-packages/opensearchpy/client/utils.py", line 178, in _wrapped
return func(*args, params=params, headers=headers, **kwargs)
File "/usr/local/lib/python3.10/site-packages/opensearchpy/client/indices.py", line 128, in create
return self.transport.perform_request(
File "/usr/local/lib/python3.10/site-packages/opensearchpy/transport.py", line 409, in perform_request
raise e
File "/usr/local/lib/python3.10/site-packages/opensearchpy/transport.py", line 370, in perform_request
status, headers_response, data = connection.perform_request(
File "/usr/local/lib/python3.10/site-packages/opensearchpy/connection/http_requests.py", line 196, in perform_request
raise ConnectionTimeout("TIMEOUT", str(e), e)
opensearchpy.exceptions.ConnectionTimeout: ConnectionTimeout caused by - ConnectTimeout(HTTPConnectionPool(host='123.456.78.9', port=9200): Max retries exceeded with url: /collections (Caused by ConnectTimeoutError(<urllib3.connection.HTTPConnection object at 0x7f64e5d12f20>, 'Connection to 123.456.78.9 timed out. (connect timeout=120)')))
...
Traceback (most recent call last):
File "/usr/local/lib/python3.10/site-packages/requests/adapters.py", line 489, in send
resp = conn.urlopen(
File "/usr/local/lib/python3.10/site-packages/urllib3/connectionpool.py", line 787, in urlopen
retries = retries.increment(
File "/usr/local/lib/python3.10/site-packages/urllib3/util/retry.py", line 592, in increment
raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='123.456.78.9', port=9200): Max retries exceeded with url: /collections (Caused by ConnectTimeoutError(<urllib3.connection.HTTPConnection object at 0x7f64e5d12f20>, 'Connection to 123.456.78.9 timed out. (connect timeout=120)'))
It's essentially saying I can't connect to 123.456.78.9, which is supposedly the Docker host IP address. I use the command /sbin/ip route | awk '/default/ { print $3 }' to find that address (re-numbered for this post).
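For reference, the same lookup can be run from the host against the running container (a hypothetical invocation, assuming the image ships the iproute2 tooling, as the command above implies):
# Print the default-gateway (docker host) address as seen from inside the container
docker exec -it myapplication-webapp sh -c "/sbin/ip route | awk '/default/ { print \$3 }'"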
Note that:
If I go to droplet_ip_address:9200 in a web browser, it shows OpenSearch-related JSON.
Running nmap -p 9200 123.456.78.9 -Pn will show
Starting Nmap 7.92 ( https://nmap.org ) at 2023-04-09 06:12 UTC
Nmap scan report for _gateway (123.456.78.9)
Host is up.
PORT     STATE    SERVICE
9200/tcp filtered wap-wsp
Nmap done: 1 IP address (1 host up) scanned in 2.11 seconds
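The filtered state reported by nmap means something is silently dropping packets to port 9200 rather than actively refusing them. As a sanity check, connectivity can also be probed from inside the Django container (a hedged example; it assumes curl is installed in the image):
# Expect this to hang and then fail in the broken setup, mirroring the
# ConnectTimeoutError in the traceback above
docker exec -it myapplication-webapp curl --max-time 5 http://123.456.78.9:9200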
I've even tried bumping the RAM to 8GB, but that hasn't helped.
On both my dev machine and on the server, I've set up OpenSearch DSL with:
opensearch_host = os.environ.get('OPENSEARCH_HOST')

OPENSEARCH_DSL = {
    'default': {
        'hosts': f'{opensearch_host}:9200'
    },
}
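For context, OPENSEARCH_HOST is just an environment variable handed to the container; a hypothetical way of supplying it in the webapp's compose file might look like this (the service name and value are assumptions, not taken from my actual file):
services:
  webapp:
    environment:
      # the docker host gateway address found earlier
      - OPENSEARCH_HOST=123.456.78.9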
My understanding is that the developer version of the OpenSearch docker compose file doesn't need credentials or certs.
Am I simply using the wrong IP address within a docker system, or somehow firewalling myself?
Upvotes: 2
Views: 682
Reputation: 410
My solution was to create an external Docker network and configure both the OpenSearch docker compose file and my website's docker compose file to use it. By default, each compose file creates its own network, which is why my Django app couldn't communicate with OpenSearch.
So first I created an external docker network...
docker network create external-example
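You can confirm the network exists, and later see which containers have joined it, with:
docker network ls
docker network inspect external-example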
Then I modified the default OpenSearch dev docker compose example by removing the lines
networks:
  - opensearch-net
and then at the end of the compose file, adding
networks:
  default:
    name: external-example
    external: true
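After those two edits, the OpenSearch compose file ends up shaped roughly like this (an abbreviated sketch; the image tag and per-service settings come from the developer example and may differ in your copy):
services:
  opensearch-node1:
    image: opensearchproject/opensearch:latest
    # ...environment, ulimits, volumes and ports as in the developer
    # example, with the per-service "networks:" entries removed...
  # opensearch-node2 and opensearch-dashboards are trimmed the same way

networks:
  default:
    name: external-example
    external: true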
I also added that same network block to the bottom of my website's docker compose file.
From there, I was able to use the service name (as defined in the OpenSearch docker compose file) of the OpenSearch node listening on port 9200 as my django-opensearch-dsl host. So in the settings.py of my Django project, I used
OPENSEARCH_DSL = {
    'default': {
        'hosts': 'opensearch-node1:9200'
    },
}
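With both stacks attached to external-example, the command from the question should now reach OpenSearch. A quick way to verify (the curl call assumes curl exists in the webapp image):
docker exec -it myapplication-webapp curl http://opensearch-node1:9200
docker exec -it myapplication-webapp python manage.py opensearch index create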
I think best practice would dictate that this method of configuration is better than relying on an IP address, which is what the OP was trying to do.
My solution referenced this blog post, but it was this Stack Overflow post that got me on the right track.
Upvotes: 0