Reputation: 970
I'm trying to get data from inside a div of a public website. Selenium WebDriver doesn't seem to find any elements. I tried finding elements by id and by class, and even with an XPath, but still didn't find anything. I can see the HTML of the page when looking at PageSource, which confirms the driver works. What am I doing wrong here? Selenium v2.53.1 // IEDriverServer Win32 v2.53.1
My code:
IWebDriver driver = new InternetExplorerDriver("C:\\Program Files\\SeleniumWebPagetester");
driver.Navigate().GoToUrl("D:\\test.html");
await Task.Delay(30000);
var src = driver.PageSource; //shows the html page -> works
var ds = driver.FindElement(By.XPath("//html//body")); //NoSuchElementException
var test = driver.FindElement(By.Id("aspnetForm")); //An unhandled exception of type 'OpenQA.Selenium.NoSuchElementException' occurred in WebDriver.dll
var testy = driver.FindElement(By.Id("aspnetForm"), 30); //'OpenQA.Selenium.NoSuchElementException'
var tst = driver.FindElement(By.XPath("//*[@id=\"lx-home\"]"), 30); //'OpenQA.Selenium.NoSuchElementException'
driver.Quit();
Simple HTML page:
<!DOCTYPE html>
<html>
<head>
<meta charset="utf-8" />
</head>
<body>
<form action="#" id="aspnetForm" onsubmit="return false;">
<section id="lx-home" style="margin-bottom:50px;">
<div class="bigbanner">
<div class="splash mc">
<div class="bighead crb">LEAD DELIVERY MADE EASY</div>
</div>
</div>
</section>
</form>
</body>
</html>
Side note: my XPath works perfectly with HtmlWeb (HtmlAgilityPack):
string Url = "D:\\test.html";
HtmlWeb web = new HtmlWeb();
HtmlDocument doc = web.Load(Url);
var element = doc.DocumentNode.SelectNodes("//*[@id=\"lx-home\"]"); //WORKS
Upvotes: 0
Views: 1078
Reputation: 5137
It seems IE parses a local file differently, so WebDriver cannot access the DOM. One way around it is to serve the page over HTTP instead of opening it from disk: copy test.html into the IIS web root, e.g.
C:\inetpub\wwwroot
then change your code to open the URL instead of the local file: driver.Navigate().GoToUrl("http://localhost/test.html");
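For reference, a minimal sketch of the adjusted code, assuming test.html was copied to C:\inetpub\wwwroot, IIS serves it at http://localhost/test.html, and the WebDriver.Support package is installed for WebDriverWait (an explicit wait is also more reliable than a fixed Task.Delay):
using System;
using OpenQA.Selenium;
using OpenQA.Selenium.IE;
using OpenQA.Selenium.Support.UI;

IWebDriver driver = new InternetExplorerDriver("C:\\Program Files\\SeleniumWebPagetester");
try
{
    // Open the page served by IIS instead of the local file.
    driver.Navigate().GoToUrl("http://localhost/test.html");

    // Explicit wait: poll for up to 30 seconds instead of a fixed delay.
    var wait = new WebDriverWait(driver, TimeSpan.FromSeconds(30));
    IWebElement form = wait.Until(d => d.FindElement(By.Id("aspnetForm")));
    IWebElement section = driver.FindElement(By.XPath("//*[@id='lx-home']"));
}
finally
{
    driver.Quit();
}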
Upvotes: 2