Tharrma Shastha

Reputation: 31

python - urllib.error.URLError: <urlopen error timed out>

I am new to Python and am using Python 3.5.0. I was trying to run the following simple code:

import urllib.request
page = urllib.request.urlopen("http://www.google.com",timeout=20)
text = page.read().decode("utf8")
print(text)

But unexpectedly I got the following error:

Traceback (most recent call last):
  File "C:\Users\Dell\AppData\Local\Programs\Python\Python35-32\lib\urllib\request.py", line 1240, in do_open
    h.request(req.get_method(), req.selector, req.data, headers)
  File "C:\Users\Dell\AppData\Local\Programs\Python\Python35-32\lib\http\client.py", line 1083, in request
    self._send_request(method, url, body, headers)
  File "C:\Users\Dell\AppData\Local\Programs\Python\Python35-32\lib\http\client.py", line 1128, in _send_request
    self.endheaders(body)
  File "C:\Users\Dell\AppData\Local\Programs\Python\Python35-32\lib\http\client.py", line 1079, in endheaders
    self._send_output(message_body)
  File "C:\Users\Dell\AppData\Local\Programs\Python\Python35-32\lib\http\client.py", line 911, in _send_output
    self.send(msg)
  File "C:\Users\Dell\AppData\Local\Programs\Python\Python35-32\lib\http\client.py", line 854, in send
    self.connect()
  File "C:\Users\Dell\AppData\Local\Programs\Python\Python35-32\lib\http\client.py", line 826, in connect
    (self.host,self.port), self.timeout, self.source_address)
  File "C:\Users\Dell\AppData\Local\Programs\Python\Python35-32\lib\socket.py", line 707, in create_connection
    raise err
  File "C:\Users\Dell\AppData\Local\Programs\Python\Python35-32\lib\socket.py", line 698, in create_connection
    sock.connect(sa)
socket.timeout: timed out

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "D:\Python\Test_Run.py", line 2, in <module>
    page = urllib.request.urlopen("http://www.google.com",timeout=20)
  File "C:\Users\Dell\AppData\Local\Programs\Python\Python35-32\lib\urllib\request.py", line 162, in urlopen
    return opener.open(url, data, timeout)
  File "C:\Users\Dell\AppData\Local\Programs\Python\Python35-32\lib\urllib\request.py", line 465, in open
    response = self._open(req, data)
  File "C:\Users\Dell\AppData\Local\Programs\Python\Python35-32\lib\urllib\request.py", line 483, in _open
    '_open', req)
  File "C:\Users\Dell\AppData\Local\Programs\Python\Python35-32\lib\urllib\request.py", line 443, in _call_chain
    result = func(*args)
  File "C:\Users\Dell\AppData\Local\Programs\Python\Python35-32\lib\urllib\request.py", line 1268, in http_open
    return self.do_open(http.client.HTTPConnection, req)
  File "C:\Users\Dell\AppData\Local\Programs\Python\Python35-32\lib\urllib\request.py", line 1242, in do_open
    raise URLError(err)
urllib.error.URLError: <urlopen error timed out>

I am connected to my university network. Could that be why I am getting this error? What can I do about it?

Upvotes: 3

Views: 8195

Answers (3)

Yagmur SAHIN

Reputation: 325

You can wrap the urlopen() call in a loop and retry the request multiple times if a timeout error occurs. This can be useful when dealing with intermittent network issues. Here's an example:

import urllib.request
import urllib.error

url = "http://example.com"
max_retries = 3
retry_count = 0
while retry_count < max_retries:
    try:
        response = urllib.request.urlopen(url, timeout=10)
        # Rest of your code
        break  # Exit the loop if the request is successful
    except urllib.error.URLError as e:
        print("Error:", e)
        retry_count += 1
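
If the failures really are intermittent, pausing between attempts usually helps. Below is a minimal sketch of the same retry loop with a simple backoff; the time.sleep() call and the 1-then-2-second schedule are just an assumption to illustrate the idea:

import time
import urllib.error
import urllib.request

url = "http://example.com"
max_retries = 3

for attempt in range(max_retries):
    try:
        response = urllib.request.urlopen(url, timeout=10)
        break  # success, stop retrying
    except urllib.error.URLError as e:
        print("Error:", e)
        if attempt < max_retries - 1:
            time.sleep(2 ** attempt)  # wait 1 second, then 2, before trying again
else:
    response = None  # every attempt failed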

Upvotes: 0

Stanislav Pankevich

Reputation: 11368

The other answer catches URLError but doesn't check for the timeout case specifically. To handle a timeout explicitly:

import socket
import urllib.error
import urllib.request

# This snippet is meant to live inside a function; `request` is a URL string
# or a urllib.request.Request object.
try:
    response = urllib.request.urlopen(request, timeout=10)
    return response
except urllib.error.URLError as e:
    if isinstance(e.reason, socket.timeout):
        # handle timeout...
        pass
    raise e
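
Put together, a self-contained version might look like the sketch below. The function name fetch_url and the choice to return None on a timeout are illustrative, not part of the answer above; the extra except socket.timeout clause is there because in some code paths a timeout may not be wrapped in URLError:

import socket
import urllib.error
import urllib.request

def fetch_url(url, timeout=10):
    # fetch_url is an illustrative name, not part of the answer above
    try:
        return urllib.request.urlopen(url, timeout=timeout)
    except urllib.error.URLError as e:
        if isinstance(e.reason, socket.timeout):
            print("Request timed out:", e.reason)
            return None
        raise  # not a timeout; let the original error propagate
    except socket.timeout:
        # a timeout is not always wrapped in URLError
        print("Request timed out")
        return None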

Upvotes: 2

P0WER1ISA

Reputation: 1

import logging
import urllib.request

MAX_RETRY = 5

def get_html(html_url, timeout=10, decode='utf-8'):
    # Try the request up to MAX_RETRY times; return the decoded page, or None if every attempt fails.
    for tries in range(MAX_RETRY):
        try:
            with urllib.request.urlopen(html_url, timeout=timeout) as response:
                return response.read().decode(decode)
        except Exception as e:
            logging.warning(str(e) + ', html_url: {0}'.format(html_url))
            if tries < (MAX_RETRY - 1):
                continue
            else:
                print('Tried {0} times to access url {1}, all failed!'.format(MAX_RETRY, html_url))
                return None
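
For completeness, a call to the function above might look like this; the URL and the 200-character preview are only an example:

html = get_html("http://www.google.com", timeout=20)
if html is None:
    print("Gave up after {0} attempts.".format(MAX_RETRY))
else:
    print(html[:200])  # show the start of the page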

Upvotes: 0
