Reputation: 204
I'm using the WebClient object to download a file like so:
byte[] buf = new byte[2000];
int read;
using (Stream strm = Client.OpenRead(url))
{
    strm.ReadTimeout = 30000;
    // Copy the response to the output file in 2000-byte chunks
    while ((read = strm.Read(buf, 0, buf.Length)) > 0)
    {
        fout.Write(buf, 0, read);
    }
}
Here url points to an S3 bucket. In some cases the download fails with a timeout at exactly 2 GB. Is this a network issue, or is there something I should change in the code?
Any ideas appreciated.
Upvotes: 4
Views: 604
Reputation: 91590
I believe WebClient is reading the file into memory, and you're probably running into a process size limitation.
What you'll want instead is WebClient.DownloadFile, which saves the response directly to a local file.
I believe this will work better for you!
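As a rough sketch, assuming Client is the same WebClient instance and destPath is a placeholder for wherever you want the file saved:

// Saves the response straight to a file on disk instead of
// a stream you have to read and copy yourself.
Client.DownloadFile(url, destPath);

If blocking the calling thread is a concern for a download this large, WebClient also offers DownloadFileAsync.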
Upvotes: 6