Reputation: 190
So I have the following code to verify that a certain URL is correct. I only need a 200 response, so I made a script; it works fine, but it's too slow:
import urllib2

def my_range(start, end, step):
    while start <= end:
        yield start
        start += step

url = 'http://exemple.com/test/'
y = 1
for x in my_range(1, 5, 1):
    y = y + 1
    url += str(y)
    print url
    req = urllib2.Request(url)
    try:
        resp = urllib2.urlopen(req)
    except urllib2.URLError, e:
        if e.code == 404:
            print "404"
        else:
            print "not 404"
    else:
        print "200"
        body = resp.read()
    url = 'http://exemple.com/test/'
In this example I assume that I have the following directories on my localhost, with these results:
http://exemple.com/test/2
200
http://exemple.com/test/3
200
http://exemple.com/test/4
404
http://exemple.com/test/5
404
http://exemple.com/test/6
404
So I searched for how to do it quicker and found this code:
import urllib2

request = urllib2.Request('http://www.google.com/')
response = urllib2.urlopen(request)
if response.getcode() == 200:
    print "200"
It seems quicker, but when I test it with a 404 like http://www.google.com/111 it gives me this result:
Traceback (most recent call last):
  File "C:\Python27\res.py", line 3, in <module>
    response = urllib2.urlopen(request)
  File "C:\Python27\lib\urllib2.py", line 126, in urlopen
    return _opener.open(url, data, timeout)
  File "C:\Python27\lib\urllib2.py", line 400, in open
    response = meth(req, response)
  File "C:\Python27\lib\urllib2.py", line 513, in http_response
    'http', request, response, code, msg, hdrs)
  File "C:\Python27\lib\urllib2.py", line 438, in error
    return self._call_chain(*args)
  File "C:\Python27\lib\urllib2.py", line 372, in _call_chain
    result = func(*args)
  File "C:\Python27\lib\urllib2.py", line 521, in http_error_default
    raise HTTPError(req.get_full_url(), code, msg, hdrs, fp)
urllib2.HTTPError: HTTP Error 404: Not Found
Any ideas, guys? Thanks very much for any help :)
Upvotes: 1
Views: 5737
Reputation: 3093
HTTPError
is an exception class (a subclass of URLError), so you can use try/except in cases like this:
import urllib2

request = urllib2.Request('http://www.google.com/')
try:
    response = urllib2.urlopen(request)
    # do stuff..
except urllib2.HTTPError:  # 404, 500, etc..
    pass
You can also add a further except clause for urllib2.URLError, which covers other (non-HTTP) errors such as timeouts.
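Putting the two clauses together, a minimal sketch might look like this (check_url is a hypothetical helper name, and the import fallback is only there so the snippet also runs on Python 3, where urllib2 was split into urllib.request and urllib.error):

    try:
        # Python 2
        from urllib2 import urlopen, HTTPError, URLError
    except ImportError:
        # Python 3
        from urllib.request import urlopen
        from urllib.error import HTTPError, URLError

    def check_url(url):
        """Return the HTTP status code for url, or None on a non-HTTP error."""
        try:
            response = urlopen(url)
            return response.getcode()  # urlopen only returns on success (e.g. 200)
        except HTTPError as e:
            return e.code              # 404, 500, etc. -- no traceback raised
        except URLError:
            return None                # DNS failure, refused connection, timeout

    print(check_url('http://www.google.com/'))     # 200 when the host is reachable
    print(check_url('http://www.google.com/111'))  # 404, as in the question's example

Note that HTTPError must be caught before URLError: since HTTPError is a subclass of URLError, listing URLError first would swallow the HTTP errors and you'd never see the status code.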
Upvotes: 4