Aufwind

Reputation: 26258

How to determine that Python's urllib2 has queried content through a given proxy?

I figured out how to use a proxy with urllib2:

import urllib
import urllib2

params = {'key': 'value'}  # example parameters; define your own here
encoded_params = urllib.urlencode(params)
url = "http://someurl.com/?" + encoded_params

header = {"User-Agent": "Mozilla/5.0 (X11; U; Linux i686) Gecko/20071127 Firefox/2.0.0.11"}

# Route all HTTP requests through the given proxy.
proxy = urllib2.ProxyHandler({'http': '193.33.125.217:8080'})
opener = urllib2.build_opener(proxy)
urllib2.install_opener(opener)

request = urllib2.Request(url, headers=header)
response = urllib2.urlopen(request)

I hope the code is correct; at least it seems to work so far. Perhaps there is a more elegant way? Also, is there something like a log or a dictionary with information about the requests I make with urllib2? Something that could tell me which proxy was used, the parameters, the IP address I had while querying, perhaps the port, and other metadata?
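The closest I have found so far are a few attributes on the response object itself (a minimal sketch; as far as I can tell these do not reveal the proxy):

# Metadata urllib2 exposes on the response object (Python 2):
print response.geturl()   # the final URL, after any redirects
print response.code       # the HTTP status code
print response.info()     # the response headers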

Upvotes: 0

Views: 136

Answers (1)

Zach Kelling

Reputation: 53819

You are doing everything correctly as far as I can tell. One way to test would be to run a simple web server and connect to it through your proxy. Making a simple test web server is easy:

from wsgiref.simple_server import demo_app
from wsgiref.simple_server import make_server

# demo_app responds with "Hello world!" plus a dump of the WSGI environ,
# including REMOTE_ADDR (the address the connection actually came from).
httpd = make_server('0.0.0.0', 8000, demo_app)
print "Serving on port 8000..."
httpd.serve_forever()

This is fine for testing, but don't leave the server running forever :) Assuming the server is reachable externally from wherever you run it (not blocked by a firewall, etc.), you should be able to verify that the proxy's address, not yours, shows up when the client connects. If you print response.read() on the client side, you can inspect the WSGI environ details that demo_app echoes back.
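For example, the client side could look something like this (a minimal sketch; the server address below is a placeholder for wherever the test server actually runs):

import urllib2

# Send the request through the proxy, not directly.
proxy = urllib2.ProxyHandler({'http': '193.33.125.217:8080'})
opener = urllib2.build_opener(proxy)

# Placeholder address: replace with the test server's public host and port.
response = opener.open('http://your-server.example.com:8000/')

# demo_app echoes the WSGI environ, including REMOTE_ADDR; that value
# should be the proxy's IP rather than your own if the proxy was used.
print response.read()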

Upvotes: 1
