Reputation: 47
I'm trying to go through a list of anchor (`a href`) links on a web page and click each link in turn. I managed to list all the links and display them using the GetAttribute method, but I'm struggling to click each link. Any advice, or a similar example in C# using Selenium, would be appreciated.
The reason I need to do this is to check that all the links can be clicked and don't return a page-not-found error.
Thanks in advance
foreach (IWebElement link in OpenPageSteps.driver1.FindElements(By.TagName("a")))
{
    if (link.Displayed)
    {
        Console.WriteLine(link.GetAttribute("href"));
    }
}
Upvotes: 0
Views: 1059
Reputation: 31
Selenium is not the best tool for checking links for 404 errors. You can gather the links with Selenium, but then just use HttpClient to check them; it will be much more efficient.
var client = new HttpClient();
try
{
    var response = await client.GetAsync(url);
    if (response.IsSuccessStatusCode)
    {
        // Normal link (2xx status code)
    }
    else
    {
        // Something wrong with the link here (404, 500, etc.)
    }
}
catch (Exception e)
{
    // Network-related issues (site does not exist, etc.)
}
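For completeness, here is a minimal sketch of how the two parts could fit together, gathering the hrefs with Selenium first and then checking each one with a shared HttpClient. The ChromeDriver setup and the page URL are assumptions for illustration, not part of your code:

using System;
using System.Linq;
using System.Net.Http;
using System.Threading.Tasks;
using OpenQA.Selenium;
using OpenQA.Selenium.Chrome;

class LinkChecker
{
    // Reuse a single HttpClient instance for all requests.
    private static readonly HttpClient Client = new HttpClient();

    static async Task Main()
    {
        using (IWebDriver driver = new ChromeDriver())
        {
            driver.Navigate().GoToUrl("https://example.com"); // hypothetical page

            // Collect the hrefs up front so we never navigate away
            // from the page while iterating over its elements.
            var urls = driver.FindElements(By.TagName("a"))
                             .Where(a => a.Displayed)
                             .Select(a => a.GetAttribute("href"))
                             .Where(href => !string.IsNullOrEmpty(href))
                             .Distinct()
                             .ToList();

            foreach (var url in urls)
            {
                try
                {
                    var response = await Client.GetAsync(url);
                    Console.WriteLine($"{(int)response.StatusCode} {url}");
                }
                catch (HttpRequestException e)
                {
                    Console.WriteLine($"FAILED {url}: {e.Message}");
                }
            }
        }
    }
}

Collecting the hrefs before any navigation also sidesteps the stale-element problem you would hit if you clicked each link in turn.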
Upvotes: 1
Reputation: 43
Your objective is to check whether each link works and does not return a page-not-found error. One simple way without Selenium is to use the WebRequest and WebResponse classes in .NET and C# to call a link and read its content.
// Requires using System.Net; and using System.IO;
public void callurl(string url)
{
    WebRequest request = WebRequest.Create(url);
    using (WebResponse response = request.GetResponse())
    using (var reader = new StreamReader(response.GetResponseStream()))
    {
        // Reads the response body from your URL; use it as you need.
        string urlText = reader.ReadToEnd();
        Console.WriteLine(urlText);
    }
}
Call this function inside the loop and check whether it returns the page content or not. Note that GetResponse throws a WebException when the server returns an error status such as 404, so catching that exception is how you detect a broken link.
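A minimal sketch of that loop, assuming the hrefs have already been collected into a urls list as in the question:

foreach (var url in urls) // urls: hrefs gathered with Selenium (assumed)
{
    try
    {
        callurl(url);
        Console.WriteLine("OK: " + url);
    }
    catch (WebException e)
    {
        // Thrown for 404/500 responses as well as network failures.
        Console.WriteLine("Broken: " + url + " (" + e.Message + ")");
    }
}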
Upvotes: 1