Reputation: 544
I have a C# function that checks whether the Internet connection is up by retrieving a small (64-byte) XML file from the router's status page:
public bool isOn()
{
    HttpWebRequest hwebRequest = (HttpWebRequest)WebRequest.Create("http://" + this.routerIp + "/top_conn.xml");
    hwebRequest.Timeout = 500;
    HttpWebResponse hWebResponse = (HttpWebResponse)hwebRequest.GetResponse();
    XmlTextReader oXmlReader = new XmlTextReader(hWebResponse.GetResponseStream());
    string value;
    while (oXmlReader.Read())
    {
        value = oXmlReader.Value;
        if (value.Trim() != "")
        {
            return !value.Substring(value.IndexOf("=") + 1, 1).Equals("0");
        }
    }
    return false;
}
Using Mozilla Firefox 3.5 with the Firebug add-on, I found that retrieving the page normally takes about 30 ms, yet even with the very generous 500 ms limit the request still times out often. How can I dramatically improve the performance?
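For reference, here is a minimal sketch for timing the request from inside the application rather than through Firebug (the address 192.168.1.1 is an assumed placeholder for this.routerIp):

using System;
using System.Diagnostics;
using System.Net;

class TimingProbe
{
    static void Main()
    {
        // Assumed address; substitute the actual value of this.routerIp.
        string url = "http://192.168.1.1/top_conn.xml";

        Stopwatch sw = Stopwatch.StartNew();
        HttpWebRequest request = (HttpWebRequest)WebRequest.Create(url);
        request.Timeout = 500;

        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
        {
            // Time until the response headers arrive; reading the
            // 64-byte body adds only a little on top of this.
            Console.WriteLine("Status {0} in {1} ms", response.StatusCode, sw.ElapsedMilliseconds);
        }
    }
}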
Thanks in advance
Upvotes: 4
Views: 4751
Reputation: 1500893
You're not closing the web response. If you've issued earlier requests to the same server and not closed those responses, that's the problem: by default a client application gets only two simultaneous connections per host, so each undisposed response keeps one of those pooled connections tied up and subsequent requests have to wait for it to be freed.
Stick the response in a using statement:
public bool IsOn()
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://" + this.routerIp + "/top_conn.xml");
    request.Timeout = 500;
    using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
    using (XmlReader reader = XmlReader.Create(response.GetResponseStream()))
    {
        while (reader.Read())
        {
            string value = reader.Value;
            if (value.Trim() != "")
            {
                return value.Substring(value.IndexOf("=") + 1, 1) != "0";
            }
        }
    }
    return false;
}
(I've made a few other alterations at the same time...)
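To make the connection-pool behaviour concrete, here is a minimal self-contained sketch (the router address 192.168.1.1 is an assumed placeholder). Raising ServicePointManager.DefaultConnectionLimit is only a diagnostic aid to confirm that pooled connections were being exhausted; disposing the response, as above, is the actual fix.

using System;
using System.Net;

class ConnectionPoolDemo
{
    static void Main()
    {
        // Raise the per-host connection cap (2 by default, as noted above)
        // purely as a diagnostic aid; it shouldn't be needed once responses
        // are disposed properly.
        ServicePointManager.DefaultConnectionLimit = 10;

        for (int i = 0; i < 5; i++)
        {
            HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://192.168.1.1/top_conn.xml");
            request.Timeout = 500;

            // Disposing the response returns its connection to the pool
            // immediately, so repeated polls stay fast.
            using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
            {
                Console.WriteLine("Poll {0}: {1}", i, response.StatusCode);
            }
        }
    }
}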
Upvotes: 8