tjwnuk

Reputation: 39

Setting the proxy using Selenium and Docker

I'm having trouble using a proxy for scraping. I use dockerized Python code and the

selenium/standalone-chrome

image. I tried something like this:

from selenium import webdriver

def get_chrome_driver(proxy):
    proxy = str(proxy)
    chrome_options = webdriver.ChromeOptions()
    chrome_options.add_argument('--proxy=%s' % proxy)
    chrome_options.add_argument("--no-sandbox")
    chrome_options.add_argument("--headless")
    chrome_options.add_argument("--disable-gpu")

    driver = webdriver.Remote(
        command_executor='http://chrome:4444/wd/hub',
        options=webdriver.ChromeOptions()
        )

    return driver

to pass the parameters, but the Chrome instance seems to ignore them. I have an example scraper that scrapes the IP address from the ident.me webpage, and it returns my machine's IP.

Upvotes: 0

Views: 157

Answers (1)

Dušan Argaláš

Reputation: 21

You are passing default options to the driver instance with this line:

options=webdriver.ChromeOptions()

You need to pass the options you created instead:

options=chrome_options
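
For reference, here is a minimal corrected sketch of the question's function, assuming the same Remote endpoint (http://chrome:4444/wd/hub) and a placeholder proxy address. Note that Chrome's command-line flag for a proxy is --proxy-server, so the original --proxy argument would likely need changing as well:

from selenium import webdriver
from selenium.webdriver.common.by import By

def get_chrome_driver(proxy):
    chrome_options = webdriver.ChromeOptions()
    # Chrome expects --proxy-server, not --proxy
    chrome_options.add_argument('--proxy-server=%s' % proxy)
    chrome_options.add_argument("--no-sandbox")
    chrome_options.add_argument("--headless")
    chrome_options.add_argument("--disable-gpu")

    driver = webdriver.Remote(
        command_executor='http://chrome:4444/wd/hub',
        options=chrome_options,  # pass the options you built, not a fresh ChromeOptions()
    )
    return driver

# Quick check: ident.me should now report the proxy's IP, not the host machine's.
# "203.0.113.5:3128" is a placeholder proxy address.
driver = get_chrome_driver("203.0.113.5:3128")
driver.get("https://ident.me")
print(driver.find_element(By.TAG_NAME, "body").text)
driver.quit()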

Upvotes: 1
