Reputation: 159
I am trying to download public images from some Facebook pages using XPath. I got the XPath from Google Chrome DevTools (right click and Copy XPath).
The XPath I got is: /html/body/div[1]/div/div[1]/div[1]/div[3]/div/div/div[1]/div[1]/div[4]/div/div/div[3]/div/div/div/div[2]
When I search for it in Google Chrome, the XPath matches just fine, as shown in the image.
But Selenium throws an exception:
selenium.common.exceptions.NoSuchElementException: Message: no such element: Unable to locate element
The code snippet I am using is as follows:
from time import sleep

driver.get(page)
sleep(10)  # crude wait for the page content to load
allimgdiv = driver.find_element_by_xpath(
    '/html/body/div[1]/div/div[1]/div[1]/div[3]/div/div/div[1]/div[1]/div[4]/div/div/div[3]/div/div/div/div[2]')
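For reference, the fixed sleep can also be replaced with an explicit wait; a minimal sketch, assuming the same XPath locator and a 10-second timeout:

from selenium.webdriver.common.by import By
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver.get(page)
# Wait up to 10 seconds for the target element instead of sleeping unconditionally
allimgdiv = WebDriverWait(driver, 10).until(
    EC.presence_of_element_located((By.XPATH,
        '/html/body/div[1]/div/div[1]/div[1]/div[3]/div/div/div[1]/div[1]/div[4]/div/div/div[3]/div/div/div/div[2]')))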
Upvotes: 0
Views: 1469
Reputation: 116
This XPath expression returns the links of only the images posted on a Facebook page of the form https://www.facebook.com/username/photos:
//h2[.//a[contains(@href,"/photos")]]//following::div//img
Try:

images = driver.find_elements_by_xpath('//h2[.//a[contains(@href,"/photos")]]//following::div//img')
for image in images:
    # Selenium XPath cannot select attributes directly, so read src via get_attribute
    linkimage = image.get_attribute('src')
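
Since the goal is to download the images, here is a minimal follow-on sketch that saves each collected src URL to disk; the requests dependency, the timeout, and the output file names are assumptions rather than part of the original answer:

import requests

# Collect the src attribute of every matched <img> element from the loop above
links = [image.get_attribute('src') for image in images]

# Download each image; the file naming scheme and .jpg extension are assumptions
for i, url in enumerate(links):
    response = requests.get(url, timeout=30)
    response.raise_for_status()
    with open(f'facebook_image_{i}.jpg', 'wb') as f:
        f.write(response.content)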
Upvotes: 2