MoorzTech

Reputation: 380

Python 'requests' with proxy not working / leaking IP?

(Strongly) related to: Requests Proxy not Working

I'm new to Python, so apologies for any blunders I might have made.

Whatever proxy I choose, whenever I check my IP on http://www.whatismyproxy.com/ it displays my real IP along with a "There may be a proxy" note. I've tried dozens of different proxies from different sources, all declared "elite", so apparently the issue is with my code. Here goes nothing:

from lxml import html
import requests
base_url = 'http://www.whatismyproxy.com/'

def pagefetch(url):
    httpproxy = "http://111.13.109.51"
    proxydict = {
                "http_proxy": httpproxy
                }

    page = requests.get(url, proxies=proxydict)
    return page

def scrape1(base_url):
    page = pagefetch(base_url)
    tree = html.fromstring(page.text)
    head1 = tree.xpath('//p[@class="h1"]/text()')
    return head1

txt1 = scrape1(base_url)
print txt1

This is a simplified version of a scraper I'm currently working on, thus it's slightly clunky. To clarify, I have no issues connecting to the proxies. Thanks in advance =) I'm using Ubuntu 14.04, btw.

Upvotes: 0

Views: 775

Answers (1)

Ian Stapleton Cordasco

Reputation: 28807

Your proxydict is wrong: requests keys the proxies mapping by URL scheme ('http', 'https'), not by environment-variable-style names like "http_proxy". It should be

proxydict = {
    'http': httpproxy
}
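Applied to the question's pagefetch, the fix is only the dict key. A minimal sketch (the make_proxydict helper name is mine, not from the original; the requests call is shown in a comment since it needs the library installed and a live proxy):

```python
def make_proxydict(httpproxy):
    # requests expects the proxies dict to be keyed by URL scheme,
    # e.g. {"http": ...}, not {"http_proxy": ...}.
    return {"http": httpproxy}

proxydict = make_proxydict("http://111.13.109.51")
# Then pass it exactly as before:
#   page = requests.get(url, proxies=proxydict)
```

With the wrong key, requests silently ignores the entry and connects directly, which is why the real IP leaked.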

If instead you want to supply the proxy as an environment variable when running your script, you can do this:

http_proxy='http://111.13.109.51' python my_script.py
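requests picks up http_proxy/https_proxy environment variables automatically (its trust_env behavior, on by default). A small Python 3 sketch of the same mechanism, using the standard library's getproxies() to show what gets detected (setting os.environ in-process here just stands in for the shell prefix above):

```python
import os
import urllib.request

# Stand-in for:  http_proxy='http://111.13.109.51' python my_script.py
os.environ["http_proxy"] = "http://111.13.109.51"

# requests reads the same environment variables that the
# standard library exposes via getproxies().
proxies = urllib.request.getproxies()
print(proxies.get("http"))  # -> http://111.13.109.51
```

So with the environment variable set, a plain requests.get(url) would already be routed through the proxy, with no proxies= argument needed.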

Upvotes: 2
