Takamura

Reputation: 485

socket.gaierror: [Errno -3] Temporary failure in name resolution

I am querying an API with this kind of code, using the Python requests library:

import requests

api_request = requests.get("http://data.api.org/search?q=example&ontologies=BFO&roots_only=true",
                           headers={'Authorization': 'apikey token=' + '<MY_TOKEN>'})

api_result = api_request.json()
collection = api_result["collection"]
...

This code works fine when I don't request a lot of content, but otherwise I get an error. What is weird is that I don't get it every time I request a lot of content. The error message is the following:

Traceback (most recent call last):
  File "/home/nobu/.local/lib/python3.6/site-packages/urllib3/connection.py", line 160, in _new_conn
    (self._dns_host, self.port), self.timeout, **extra_kw
  File "/home/nobu/.local/lib/python3.6/site-packages/urllib3/util/connection.py", line 61, in create_connection
    for res in socket.getaddrinfo(host, port, family, socket.SOCK_STREAM):
  File "/usr/lib/python3.6/socket.py", line 745, in getaddrinfo
    for res in _socket.getaddrinfo(host, port, family, type, proto, flags):
socket.gaierror: [Errno -3] Temporary failure in name resolution

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/nobu/.local/lib/python3.6/site-packages/urllib3/connectionpool.py", line 677, in urlopen
    chunked=chunked,
  File "/home/nobu/.local/lib/python3.6/site-packages/urllib3/connectionpool.py", line 392, in _make_request
    conn.request(method, url, **httplib_request_kw)
  File "/usr/lib/python3.6/http/client.py", line 1239, in request
    self._send_request(method, url, body, headers, encode_chunked)
  File "/usr/lib/python3.6/http/client.py", line 1285, in _send_request
    self.endheaders(body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.6/http/client.py", line 1234, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.6/http/client.py", line 1026, in _send_output
    self.send(msg)
  File "/usr/lib/python3.6/http/client.py", line 964, in send
    self.connect()
  File "/home/nobu/.local/lib/python3.6/site-packages/urllib3/connection.py", line 187, in connect
    conn = self._new_conn()
  File "/home/nobu/.local/lib/python3.6/site-packages/urllib3/connection.py", line 172, in _new_conn
    self, "Failed to establish a new connection: %s" % e
urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x7f4bdeca7080>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/nobu/.local/lib/python3.6/site-packages/requests/adapters.py", line 449, in send
    timeout=timeout
  File "/home/nobu/.local/lib/python3.6/site-packages/urllib3/connectionpool.py", line 725, in urlopen
    method, url, error=e, _pool=self, _stacktrace=sys.exc_info()[2]
  File "/home/nobu/.local/lib/python3.6/site-packages/urllib3/util/retry.py", line 439, in increment
    raise MaxRetryError(_pool, url, error or ResponseError(cause))
urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='data.api.org', port=80): Max retries exceeded with url: /ontologies/NCIT/classes/http%3A%2F%2Fncicb.nci.nih.gov%2Fxml%2Fowl%2FEVS%2FThesaurus.owl%23C48481/descendants (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f4bdeca7080>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution',))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "new_format.py", line 181, in <module>
    ontology_api(extraction(90))
  File "new_format.py", line 142, in ontology_api
    concept_extraction(collection)
  File "new_format.py", line 100, in concept_extraction
    api_request_tree = requests.get(f"{leaf}", headers={'Authorization': 'apikey token=' + f'{api_key}'})
  File "/home/nobu/.local/lib/python3.6/site-packages/requests/api.py", line 76, in get
    return request('get', url, params=params, **kwargs)
  File "/home/nobu/.local/lib/python3.6/site-packages/requests/api.py", line 61, in request
    return session.request(method=method, url=url, **kwargs)
  File "/home/nobu/.local/lib/python3.6/site-packages/requests/sessions.py", line 530, in request
    resp = self.send(prep, **send_kwargs)
  File "/home/nobu/.local/lib/python3.6/site-packages/requests/sessions.py", line 643, in send
    r = adapter.send(request, **kwargs)
  File "/home/nobu/.local/lib/python3.6/site-packages/requests/adapters.py", line 516, in send
    raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPConnectionPool(host='data.api.org', port=80): Max retries exceeded with url: /ontologies/NCIT/classes/http%3A%2F%2Fncicb.nci.nih.gov%2Fxml%2Fowl%2FEVS%2FThesaurus.owl%23C48481/descendants (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7f4bdeca7080>: Failed to establish a new connection: [Errno -3] Temporary failure in name resolution',))

I don't know if I get the error because I am over-requesting the API or if it is due to something else. I was not able to find an answer to my problem on Stack Overflow or elsewhere.

Upvotes: 6

Views: 11247

Answers (2)

questionto42

Reputation: 9630

None/error output from a query can lead to the error

In my case, the error vanished when I ran the code in the container with the right parameter. Before, one of the parameters contained an id that did not exist in the dataset I ran the code on. Meaning: a query tried to find an id in the dataset, found nothing, so there was no output, and this legacy container reported:

/usr/local/lib/python2.7/dist-packages/sqlalchemy/orm/properties.py:1029: SAWarning: On Product.product_attribute, 'passive_deletes' is normally configured on one-to-many, one-to-one, many-to-many relationships only.
  self._check_cascade_settings(self._cascade)
Traceback (most recent call last):
  File "/usr/lib/python2.7/logging/handlers.py", line 556, in emit
    self.send(s)
  File "/usr/local/lib/python2.7/dist-packages/graypy/handler.py", line 37, in send
    DatagramHandler.send(self, s)
  File "/usr/lib/python2.7/logging/handlers.py", line 607, in send
    self.sock.sendto(s, (self.host, self.port))
gaierror: [Errno -3] Temporary failure in name resolution

and further down at the end of the run:

    raise e
AutoReconnect: connection closed

It looks as if sqlalchemy cannot handle a None output from a query, which closes the connection; it then tries again and again to reach it until the connection finally gets closed after x tries.

Other debugging steps you might give a chance

I am a beginner, so do not blindly trust this answer, but I still dare to offer it.

"Temporary failure in name resolution" means that you cannot reach a server in your network, be it your host, your dns, where you log, the cloud.

  • The first step is to ping each of the servers that your code tries to reach, and the DNS server of your network, to see whether the names resolve at all (a small check from Python is sketched after this list).
  • You might also have a forgotten container still running that changes your network (check with docker ps) or that disturbs a module with its network traffic.
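
A minimal sketch of that first check, done from Python itself (the host names are only examples, not anything from your code):

import socket

def can_resolve(host):
    # True if the name resolves right now, False on a gaierror.
    try:
        socket.getaddrinfo(host, 80, proto=socket.IPPROTO_TCP)
        return True
    except socket.gaierror:
        return False

# Compare the API host with a well-known name to tell "this host's name
# is broken" apart from "my DNS is not answering at all".
print(can_resolve("data.api.org"), can_resolve("google.com"))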

But if it happens only at times, switching on and off for much the same data load, you can try debugging it the plain way:

  • Catch the exception and log, inside the handler, whether you can still reach all servers at the moment of the error (see the sketch after this list).
  • Switch off logging by commenting it out in your code, then run the code for a longer time to see whether the error still comes up. Logging itself leads to network traffic; the same goes for the API call, but my guess is that the problem is a Python module that cannot handle race conditions when spikes of API calls get logged.
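
A minimal sketch of the first point, assuming the requests call from the question (the URL, token, and the retry/wait numbers are just placeholders):

import logging
import socket
import time
from urllib.parse import urlparse

import requests

log = logging.getLogger(__name__)

def get_with_checks(url, headers, retries=3, wait=5):
    # Issue a GET; on a connection error, log whether the host still
    # resolves before sleeping and retrying. Purely illustrative.
    for attempt in range(1, retries + 1):
        try:
            return requests.get(url, headers=headers, timeout=30)
        except requests.exceptions.ConnectionError as exc:
            host = urlparse(url).hostname
            try:
                socket.getaddrinfo(host, 80)
                log.warning("attempt %d: %s resolves, but the connection failed: %s",
                            attempt, host, exc)
            except socket.gaierror:
                log.warning("attempt %d: name resolution for %s failed: %s",
                            attempt, host, exc)
            time.sleep(wait)
    raise RuntimeError("giving up on %s after %d attempts" % (url, retries))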

Upvotes: 0

lam vu Nguyen

Reputation: 641

with requests.Session() as s:
    s.get('http://google.com')  # the session and its pooled connections are closed on exit

or

with requests.get('http://httpbin.org/get', stream=True) as r:
    # Do something

This is another way:

Python-Requests close http connection

But thanks for the hint about session.mount('http://', requests.adapters.HTTPAdapter(max_retries=100)); a fuller sketch of that idea is below.
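
For completeness, a minimal sketch of that session.mount idea combined with urllib3's Retry (the URL, token, and retry numbers are only placeholders, not values from the question):

import requests
from requests.adapters import HTTPAdapter
from urllib3.util.retry import Retry

session = requests.Session()
retry = Retry(total=5, backoff_factor=1,
              status_forcelist=[429, 500, 502, 503, 504])
session.mount('http://', HTTPAdapter(max_retries=retry))
session.mount('https://', HTTPAdapter(max_retries=retry))

response = session.get('http://data.api.org/search?q=example',
                       headers={'Authorization': 'apikey token=' + '<MY_TOKEN>'})
print(response.status_code)

Reusing one Session also keeps connections alive, so repeated calls do not need a new connection and DNS lookup each time.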

Upvotes: 2
