Reputation: 393
I am using Selenium WebDriver with Chrome to scrape a website's search results, running Python 3.6 on Windows 10. When using driver.get(), certain pages do not load and Selenium times out (regardless of how long I set the timeout). The problem is that after the timeout, I cannot get() any other URL. My code:
from selenium import webdriver

driver = webdriver.Chrome()
for link in link_list:
    try:
        driver.get(link)
        # do some stuff
    except:
        continue
What happens is that if driver.get(somelink) throws a timeout exception, then every subsequent call to driver.get() fails. The Chrome window itself stalls on somelink permanently.
It seems like the driver stops working completely once it throws an exception, regardless of whether or not I catch it. I am not sure if this happens for every type of exception, or only on a timeout. I have a workaround where I close and re-open the window, but it's messy. Is there any way to make the browser continue?
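For reference, this is a minimal sketch of the close-and-reopen workaround I mentioned (link_list is the same list of URLs as above, and the 30-second timeout is just an example value):

from selenium import webdriver
from selenium.common.exceptions import TimeoutException

driver = webdriver.Chrome()
driver.set_page_load_timeout(30)  # seconds before get() raises TimeoutException

for link in link_list:
    try:
        driver.get(link)
        # do some stuff
    except TimeoutException:
        # After a timeout the driver no longer responds to get(),
        # so tear down the browser and start a fresh session.
        driver.quit()
        driver = webdriver.Chrome()
        driver.set_page_load_timeout(30)
        continue

driver.quit()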
Upvotes: 3
Views: 888
Reputation: 328
This is a long-standing bug in Selenium and has no solution currently.
Upvotes: 1