LA Developers

Reputation: 11

Checking website response within x seconds

Good day. The problem I am facing is that I want to check whether my website is up or not. This is the sample pseudo code:

Check(website.com)
if checking_time > 10 seconds:
  print "No response Recieve"
else:
  print "Site is up"

I already tried the code below but it's not working:

import urllib
import time

try:
    response = urllib.urlopen("http://insurance.contactnumbersph.com").getcode()
    time.sleep(5)
    if response == "" or response == "403":
        print "No response"
    else:
        print "ok"
except IOError:
    print "No response"

Upvotes: 1

Views: 429

Answers (2)

Sam Mason

Reputation: 16184

Note that the timeout that gets passed around by urllib applies to the "wrong thing": each individual network operation (e.g. hostname resolution, socket connection, sending headers, reading a few bytes of the headers, reading a few more bytes of the response) gets this same timeout applied separately. Hence passing a "timeout" of 10 seconds could allow a large response to continue downloading for hours.
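
For reference, this is the timeout being described; a minimal sketch (using httpbin.org as a stand-in URL):

from urllib.request import urlopen

# timeout=10 bounds each individual socket operation (the connect,
# each read of the response) separately, not the total wall-clock time
res = urlopen('https://httpbin.org/delay/2', timeout=10)
print(res.status)
res.close()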

If you want to stick to built-in Python code it would be nice to use a thread for this, but it doesn't seem to be possible to cancel running threads nicely. An async library like trio would allow better timeout and cancellation handling, but we can make do by using the multiprocessing module instead:

from urllib.request import Request, urlopen
from multiprocessing import Process
from time import perf_counter

def _http_ping(url):
    # issue a HEAD request so we don't download the response body
    req = Request(url, method='HEAD')
    print(f'trying {url!r}')
    start = perf_counter()
    res = urlopen(req)
    secs = perf_counter() - start
    print(f'response {url!r} of {res.status} after {secs*1000:.2f}ms')
    res.close()

def http_ping(url, timeout):
    # run the request in a child process so it can be killed after a
    # hard wall-clock timeout, which threads don't allow
    proc = Process(target=_http_ping, args=(url,))
    try:
        proc.start()
        proc.join(timeout)
        # if the child is still alive, the request took too long
        success = not proc.is_alive()
    finally:
        proc.terminate()
        proc.join()
        proc.close()
    return success

You can use https://httpbin.org/ to test this, e.g.:

http_ping('https://httpbin.org/delay/2', 1)

should print out a "trying" message, but not a "response" message. You can adjust the delay time and timeout to explore how this behaves...
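
Conversely, a timeout longer than the delay should print both messages and return True (assuming httpbin.org responds in time):

http_ping('https://httpbin.org/delay/2', 5)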

Note that this spins up a new process for each request, but as long as you're doing fewer than a thousand pings a second it should be OK.

Upvotes: 0

Narasimha Prasanna HN

Reputation: 662

If the website is not up and running, you will get a connection refused error, and it doesn't actually return any status code. So you can catch the error in Python with simple try: and except: blocks.

import requests

URL = 'http://some-url-where-there-is-no-server'
try:
    resp = requests.get(URL)
except Exception as e:
    # handle the failure here
    print(e)  # for example
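
To match the 10-second limit from the question, requests also accepts a timeout argument, which raises an exception when exceeded (a sketch; like urllib's timeout, it applies per socket operation rather than to the whole download):

import requests

URL = 'http://some-url'
try:
    # raises requests.exceptions.Timeout if connecting or a read
    # takes longer than 10 seconds
    resp = requests.get(URL, timeout=10)
    print("Site is up")
except requests.exceptions.RequestException as e:
    print("No response received:", e)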

You can also retry up to 10 times, once per second: each iteration checks whether the request raises an exception, and if it does, waits a second and checks again.

import time

import requests

URL = 'http://some-url'

counts = 0
gotConnected = False

while counts < 10:
    try:
        resp = requests.get(URL)
        gotConnected = True
        break
    except Exception:
        counts += 1
        time.sleep(1)

The result will be available in the gotConnected flag, which you can use later to take the appropriate action.
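
For example, tying it back to the pseudo code in the question:

if gotConnected:
    print("Site is up")
else:
    print("No response received")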

Upvotes: 1
