Reputation: 1952
I am using Python Selenium to crawl several web pages, and I previously know that some of the pages don't have all elements.
I am already waiting for the page to load and using try/except to find the element (class xyz, for example), but it takes 30 seconds to reach the exception.
from selenium.common.exceptions import NoSuchElementException

try:
    xyz = driver.find_element_by_css_selector('div.xyz').text
except NoSuchElementException:
    print("\tError finding xyz...")
How can I set a smaller timeout, 5 seconds for example, as the maximum time for Selenium to look for the element before raising the exception and moving on to the next page?
Upvotes: 6
Views: 1629
Reputation: 14145
You can use WebDriverWait with an expected condition to achieve this.
You need the following imports:
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.common.by import By
from selenium.webdriver.support import expected_conditions as EC
Script:
from selenium.common.exceptions import TimeoutException

try:
    # Wait at most 5 seconds for the element to be present in the DOM.
    xyz = WebDriverWait(driver, 5).until(
        EC.presence_of_element_located((By.CSS_SELECTOR, 'div.xyz'))
    ).text
except TimeoutException:
    print("\tError finding xyz...")
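If the 30-second delay comes from an implicit wait you set elsewhere (via driver.implicitly_wait(30)), another option is to lower it temporarily around the lookup and restore it afterwards. Below is a minimal sketch of that idea; the short_wait helper is hypothetical (not a Selenium API), and FakeDriver is only a stand-in for a real WebDriver so the snippet is self-contained and runnable without a browser. With a real driver, any find_element call inside the with block would give up after roughly 5 seconds instead of 30.

```python
from contextlib import contextmanager

@contextmanager
def short_wait(driver, seconds, default=30):
    """Temporarily lower the driver's implicit wait, then restore it.

    `default` is an assumption: pass whatever implicit wait your
    script normally uses.
    """
    driver.implicitly_wait(seconds)
    try:
        yield
    finally:
        driver.implicitly_wait(default)

# Stand-in for a real WebDriver: it just records the waits it was given.
class FakeDriver:
    def __init__(self):
        self.waits = []
    def implicitly_wait(self, seconds):
        self.waits.append(seconds)

driver = FakeDriver()
with short_wait(driver, 5):
    pass  # driver.find_element(...) would time out after ~5s here
print(driver.waits)  # [5, 30]
```

Note that mixing implicit and explicit waits is generally discouraged, so if you adopt the WebDriverWait approach above, consider removing the implicit wait entirely.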
Upvotes: 5