Tamir Ciprut

Reputation: 67

python selenium - takes a lot of time when it does not find elements

My code scans many web pages with chromedriver and searches each page for the same element with "find_elements_by_xpath":

Lines = driver.find_elements_by_xpath(
                    '//*[@id="top"]/div[contains(@style, "display: block;")]/'
                    'div[contains(@style, "display: block;")]//tbody//a[contains(@title, "Line")]')

When it finds one or more elements, it works fast and well. But when the XPath doesn't match anything, it runs for 6-7 seconds before moving on.

Can I limit the search to 1 second, and if nothing is found within that second, just move on? Is there a way to do this?

Upvotes: 5

Views: 1900

Answers (1)

Andersson

Reputation: 52665

Try to use an explicit wait (WebDriverWait) as below:

from selenium.webdriver.support import expected_conditions as EC
from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait as wait
from selenium.common.exceptions import TimeoutException

try:
    Lines = wait(driver, 1).until(EC.presence_of_all_elements_located(
        (By.XPATH,
         '//*[@id="top"]/div[contains(@style, "display: block;")]/'
         'div[contains(@style, "display: block;")]//tbody//a[contains(@title, "Line")]')))
except TimeoutException:
    Lines = []  # fall back to an empty list so Lines is always defined

This waits up to 1 second for at least one matching element: if any are found, you get the list of required WebElements; otherwise it times out after 1 second and moves on.
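Under the hood, WebDriverWait simply polls the condition every fraction of a second until it returns a truthy value or the timeout expires. Here is a minimal, browser-free sketch of that polling loop (the `condition` callables below are hypothetical stand-ins for the XPath lookup, not Selenium API):

```python
import time

def wait_until(condition, timeout=1.0, poll=0.1):
    """Poll `condition` until it returns a truthy value or `timeout` expires.

    Mimics what WebDriverWait.until does, except it returns an empty list
    on timeout instead of raising TimeoutException.
    """
    deadline = time.monotonic() + timeout
    while time.monotonic() < deadline:
        result = condition()
        if result:  # truthy result: elements were "found"
            return result
        time.sleep(poll)  # wait before polling again
    return []  # timeout expired without a truthy result

# Stand-in for a page where the elements exist: found immediately.
found = wait_until(lambda: ["line-1", "line-2"])

# Stand-in for a page where the XPath never matches: gives up after 0.3 s.
missing = wait_until(lambda: [], timeout=0.3)
```

This is why a short timeout works well here: on pages where the elements exist, the very first poll succeeds and there is almost no delay; the full timeout is only paid on pages where nothing matches.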

Upvotes: 3
