Reputation: 69
I have been searching loads of forums for a way to use a proxy with the Selenium library in Python, to avoid the "max number" timeout I keep hitting while web scraping.
I found the script below in many forums, but it just doesn't work for me at all. Could anyone give me some advice on how to set up a proxy for Chrome with Python and Selenium?
Thanks a lot!
SCRIPT:
from selenium import webdriver
from selenium.webdriver.chrome.options import Options

chromedriver = directory....  # path to the chromedriver executable
PROXY = "177.202.59.58:8080"

# Route all browser traffic through the proxy.
chrome_options = Options()
chrome_options.add_argument('--proxy-server=%s' % PROXY)

chrome = webdriver.Chrome(chromedriver, options=chrome_options)
chrome.get("https://whatismyipaddress.com")
Upvotes: 1
Views: 3938
Reputation: 929
There's nothing wrong with your code. That proxy is just not available/not working anymore. Try to find another proxy with better uptime. Keep in mind that public proxies have noticeable latency, so pages will load pretty slowly.
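If you want to rule out dead proxies before launching the browser, a sketch like the one below can help. It assumes the `requests` package is installed and that chromedriver is on your PATH (or Selenium 4.6+, which manages the driver itself); the second proxy address and the httpbin test URL are just placeholders, so swap in your own proxy list.

```python
import requests

from selenium import webdriver
from selenium.webdriver.chrome.options import Options

CANDIDATE_PROXIES = [
    "177.202.59.58:8080",   # the proxy from the question
    "203.0.113.10:3128",    # placeholder address, replace with your own
]

def proxy_is_alive(proxy, timeout=5):
    """Return True if a simple HTTP request through the proxy succeeds."""
    try:
        resp = requests.get(
            "https://httpbin.org/ip",
            proxies={"http": f"http://{proxy}", "https": f"http://{proxy}"},
            timeout=timeout,
        )
        return resp.ok
    except requests.RequestException:
        return False

# Pick the first proxy that responds, then start Chrome with it.
working = next((p for p in CANDIDATE_PROXIES if proxy_is_alive(p)), None)
if working is None:
    raise RuntimeError("No working proxy found; try a fresh proxy list.")

chrome_options = Options()
chrome_options.add_argument(f"--proxy-server={working}")
chrome = webdriver.Chrome(options=chrome_options)
chrome.get("https://whatismyipaddress.com")
```

The timeout keeps a dead proxy from stalling the whole script, and checking with a plain HTTP request is much cheaper than spinning up a browser only to find the proxy is down.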
Upvotes: 1