ev47295

Reputation: 19

Is there a faster way to check the availability of numerous websites?

Good day, everyone.

This code checks the availability of a website, but it loads the whole page, so with a list of 100 websites it will be slow.

My question is: is there any way to do this faster?

import requests
user_agent = {'accept': '*/*', 'user-agent': 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/69.0.3497.100 Safari/537.36'}

session = requests.Session()
response = session.get("http://google.com", headers=user_agent, timeout=5)
if response.status_code == 200:
    print("Checked & available")
else:
    print("Not available")

Thanks!

Any help will be appreciated.

Upvotes: 1

Views: 224

Answers (2)

warvariuc

Reputation: 59664

This code checks the availability of a website, but it loads the whole page

To avoid loading the whole page, you can issue HEAD requests instead of GET requests, so you only check the status. See Getting HEAD content with Python Requests
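Applied to the code from the question, this is just a matter of swapping session.get for session.head (note that allow_redirects is off by default for HEAD, so it is enabled explicitly here):

```python
import requests

user_agent = {'accept': '*/*', 'user-agent': 'Mozilla/5.0'}

session = requests.Session()
# A HEAD request asks the server for headers only, so the response body
# is never downloaded; follow redirects so e.g. http://google.com resolves.
response = session.head("http://google.com", headers=user_agent,
                        timeout=5, allow_redirects=True)
if response.ok:  # True for any status code below 400
    print("Checked & available")
else:
    print("Not available")
```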

Another way to make it faster is to issue multiple requests concurrently, using multiple threads or asyncio ( https://pawelmhm.github.io/asyncio/python/aiohttp/2016/04/22/asyncio-aiohttp.html ).

Upvotes: 1

Hamza Lachi

Reputation: 1064

You can use this:

import urllib.request

# getcode() returns the HTTP status; urlopen raises URLError if the host is unreachable
print(urllib.request.urlopen("http://www.google.com", timeout=5).getcode())
# output: 200

Upvotes: 1
