Roey

Reputation: 849

HTTPWebResponse + StreamReader Very Slow

I'm trying to implement a limited web crawler in C# (for a few hundred sites only) using HttpWebRequest.GetResponse() and StreamReader.ReadToEnd(). I've also tried using StreamReader.Read() in a loop to build my HTML string.

I'm only downloading pages which are about 5-10K.

It's all very slow! For example, the average GetResponse() time is about half a second, while the average StreamReader.ReadToEnd() time is about 5 seconds!

All sites should respond very fast, as they are very close to my location and have fast servers (downloading them in Internet Explorer takes practically no time), and I am not using any proxy.

My Crawler has about 20 threads reading simultaneously from the same site. Could this be causing a problem?

How do I reduce StreamReader.ReadToEnd times DRASTICALLY?

Upvotes: 21

Views: 24459

Answers (9)

ashkufaraz

Reputation: 5297

Try adding the cookie (AspxAutoDetectCookieSupport=1) to your request, like this:

request.CookieContainer = new CookieContainer();
request.CookieContainer.Add(new Cookie("AspxAutoDetectCookieSupport", "1") { Domain = target.Host });

Upvotes: 0

Pangamma

Reputation: 807

Why wouldn't multithreading solve this issue? Multithreading would minimize the network wait times, and since you'd be storing the contents of the buffer in system memory (RAM), there would be no IO bottleneck from dealing with a filesystem. Thus, the 82 pages that take 82 seconds to download and parse should take something like 15 seconds (assuming a 4x processor). Correct me if I'm missing something.

____ DOWNLOAD THREAD ____

    Download contents

    Form stream

    Read contents

_________________________
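A minimal sketch of the idea above (the helper names and the fake WebClient usage are illustrative, not from the original post): running all fetches as tasks lets the network waits overlap, so total wall-clock time approaches the slowest single download rather than the sum of all of them.

```csharp
using System;
using System.Collections.Generic;
using System.Linq;
using System.Net;
using System.Threading.Tasks;

class ParallelCrawlSketch
{
    // Runs all fetches concurrently so the network waits overlap.
    // The fetch delegate is injected so the download mechanism can
    // be swapped (or faked in a test) without changing the scheduling.
    public static Task<string[]> DownloadAllAsync(
        IEnumerable<string> urls, Func<string, Task<string>> fetch)
    {
        return Task.WhenAll(urls.Select(fetch));
    }

    // Hypothetical usage with WebClient (not from the original post):
    public static Task<string[]> DownloadPagesAsync(IEnumerable<string> urls)
    {
        return DownloadAllAsync(urls, async url =>
        {
            using (var client = new WebClient())
            {
                client.Proxy = null; // avoid proxy auto-detection delays
                return await client.DownloadStringTaskAsync(url);
            }
        });
    }
}
```

Note that with 20 threads hitting the same site, the per-host connection limit (2 by default) matters more than thread count; see the answers below about raising it.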

Upvotes: 0

Yuriy

Reputation: 84

Thank you all for the answers; they helped me dig in the proper direction. I faced the same performance issue, but the proposed solution of changing the application config file (as I understand it, that solution is for web applications) didn't fit my needs. My solution is shown below:

HttpWebRequest webRequest = (HttpWebRequest)System.Net.WebRequest.Create(fullUrl);
webRequest.Method = WebRequestMethods.Http.Post;

if (useDefaultProxy)
{
    webRequest.Proxy = System.Net.WebRequest.DefaultWebProxy;
    webRequest.Credentials = CredentialCache.DefaultCredentials;
}
else
{
    // Clearing the default proxy skips automatic proxy detection,
    // which is what causes the long delay on the first request.
    System.Net.WebRequest.DefaultWebProxy = null;
    webRequest.Proxy = System.Net.WebRequest.DefaultWebProxy;
}

Upvotes: 0

bisand

Reputation: 121

I had the same problem, but when I set the HttpWebRequest's Proxy property to null, it solved the problem.

UriBuilder ub = new UriBuilder(url);
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(ub.Uri);
request.Proxy = null;
HttpWebResponse response = (HttpWebResponse)request.GetResponse();

Upvotes: 4

thunder

Reputation: 11

I found that the application config method did not work, but the problem was still due to the proxy settings. My simple request used to take up to 30 seconds; now it takes 1.

public string GetWebData()
{
    string DestAddr = "http://mydestination.com";
    System.Net.WebClient myWebClient = new System.Net.WebClient();
    // An empty WebProxy means no proxy is used. Note that IsBypassed()
    // only returns a bool; calling it does not change any settings.
    WebProxy myProxy = new WebProxy();
    myWebClient.Proxy = myProxy;
    return myWebClient.DownloadString(DestAddr);
}

Upvotes: 1

vt2

Reputation: 11

I had the same problem, but worse: response = (HttpWebResponse)webRequest.GetResponse(); in my code delayed about 10 seconds before any more code ran, and after that the download saturated my connection.

kurt's answer (defaultProxy enabled="false") solved the problem. Now the response is almost instant and I can download any HTTP file at my connection's maximum speed.
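For reference, the same fix can be applied in code rather than in the config file; a sketch, equivalent in effect to disabling the default proxy:

```csharp
using System.Net;

class NoProxySetup
{
    public static void DisableProxyDetection()
    {
        // Equivalent in effect to <defaultProxy enabled="false"/> in config:
        // with no default proxy, HttpWebRequest skips proxy auto-detection,
        // which is what causes the long first-request delay.
        WebRequest.DefaultWebProxy = null;
    }
}
```

Call this once at startup, before the first request is created.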

Upvotes: 1

No Refunds No Returns

Reputation: 8336

Have you tried ServicePointManager.DefaultConnectionLimit? I usually set it to 200 for things similar to this.
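A sketch of setting it at startup (200 is the value suggested above; in code the property is ServicePointManager.DefaultConnectionLimit):

```csharp
using System.Net;

class ConnectionLimitSetup
{
    public static void RaiseConnectionLimit()
    {
        // The default is 2 connections per host for client apps; raising it
        // lets many simultaneous requests to the same site proceed at once,
        // which matters for a crawler with 20 threads on one domain.
        ServicePointManager.DefaultConnectionLimit = 200;
    }
}
```

This must run before the first request to a host; existing ServicePoint instances keep their old limit.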

Upvotes: 1

kgriffs

Reputation: 4258

HttpWebRequest may be taking a while to detect your proxy settings. Try adding this to your application config:

<system.net>
  <defaultProxy enabled="false">
    <proxy/>
    <bypasslist/>
    <module/>
  </defaultProxy>
</system.net>

You might also see a slight performance gain from buffering your reads to reduce the number of calls made to the underlying operating system socket:

using (BufferedStream buffer = new BufferedStream(stream))
{
  using (StreamReader reader = new StreamReader(buffer))
  {
    pageContent = reader.ReadToEnd();
  }
}

Upvotes: 16

Matt Brindley

Reputation: 9867

WebClient's DownloadString is a simple wrapper around HttpWebRequest. Could you try using it temporarily and see if the speed improves? If things get much faster, could you share your code so we can have a look at what may be wrong with it?
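A minimal sketch of that test (the helper name is illustrative): timing the WebClient call side by side with the existing HttpWebRequest + StreamReader path would show whether the request code itself is the bottleneck.

```csharp
using System;
using System.Diagnostics;
using System.Net;

class WebClientSketch
{
    // Fetches a page with WebClient and reports how long it took,
    // for comparison against the HttpWebRequest + StreamReader path.
    public static string TimedDownload(string url, out long elapsedMs)
    {
        var sw = Stopwatch.StartNew();
        string body;
        using (var client = new WebClient())
        {
            body = client.DownloadString(url);
        }
        elapsedMs = sw.ElapsedMilliseconds;
        return body;
    }
}
```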

EDIT:

It seems HttpWebRequest observes IE's 'max concurrent connections' setting. Are these URLs on the same domain? You could try increasing the connection limit to see if that helps. I found this article about the problem:

By default, you can't perform more than 2-3 async HttpWebRequests (this depends on the OS). In order to override it (the easiest way, IMHO), don't forget to add this section to the application's config file:

<system.net>
  <connectionManagement>
     <add address="*" maxconnection="65000" />
  </connectionManagement>
</system.net>

Upvotes: 8
