Reputation: 1194
I have the following function, which gets me the HTML source of a website through a proxy. It works fine, except that sometimes, when the server returns 503 (Service Unavailable) or some other error, it never goes into the catch statement.
In the catch statement, the function is supposed to call itself recursively, up to 4 times; if the request keeps failing after 4 tries, null is returned.
private static string GetPageHTML(string link, bool useprx)
{
    int tryCount = 0;
    WebClient client = new WebClient()
    {
        Proxy = new WebProxy(ProxyManager.GetProxy())
        {
            Credentials = new NetworkCredential("xx", "xx")
        }
    };
    try
    {
        return client.DownloadString(link);
    }
    catch (WebException ex)
    {
        var statuscode = ((HttpWebResponse)ex.Response).StatusCode;
        if (tryCount == 3)
        {
            return null;
        }
        switch (statuscode)
        {
            case HttpStatusCode.Forbidden:
                tryCount++;
                System.Threading.Thread.Sleep(5000);
                return GetPageHTML(link, useprx);
            case HttpStatusCode.NotFound:
                return null;
            case HttpStatusCode.GatewayTimeout:
                tryCount++;
                System.Threading.Thread.Sleep(5000);
                return GetPageHTML(link, useprx);
            case HttpStatusCode.ServiceUnavailable:
                tryCount++;
                System.Threading.Thread.Sleep(5000);
                return GetPageHTML(link, useprx);
            default:
                return null;
        }
    }
}
So why does it never go into the catch statement?
Upvotes: 2
Views: 2611
Reputation: 39248
It's probably throwing an exception that is not of type WebException. To catch all exceptions under the sun, you have to include a catch (Exception) block as a fallback.
Add the fallback catch after the WebException catch, and debug it to see what type of exception is really being thrown.
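A minimal sketch of what that catch chain could look like inside your GetPageHTML try block (the status-code switch from your question is elided; the null check on ex.Response is an extra guard I'd assume you want, since Response is null when the failure produced no HTTP response at all, such as a connection timeout or a proxy error, and the unguarded cast would then throw a NullReferenceException from inside the catch):

try
{
    return client.DownloadString(link);
}
catch (WebException ex)
{
    // ex.Response is null when no HTTP response was received
    // (timeout, DNS failure, proxy refusing the connection),
    // so guard the cast before reading StatusCode.
    var response = ex.Response as HttpWebResponse;
    if (response == null)
        return null;

    var statuscode = response.StatusCode;
    // ... your existing switch on statuscode ...
    return null;
}
catch (Exception ex)
{
    // Fallback: put a breakpoint or a log line here to see
    // what exception type is actually escaping the handler above.
    Console.WriteLine(ex.GetType().FullName + ": " + ex.Message);
    return null;
}

Catch blocks are tested in order, so the more specific WebException handler still runs first; the Exception handler only sees whatever the first one doesn't match.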
Upvotes: 3