Reputation: 338
I'm currently building an application that is, among other things, going to download large files from an FTP server. Everything works fine for small files (< 50 MB) but the files I'm downloading are way bigger, mainly over 2 GB.
I've been trying a WebClient using DownloadFileAsync() and a list system, as I'm downloading these files one after the other due to their size.
DownloadClient.DownloadProgressChanged += new DownloadProgressChangedEventHandler(DownloadProgress);
DownloadClient.DownloadFileCompleted += new AsyncCompletedEventHandler(DownloadCompleted);

private void FileDownload()
{
    DownloadClient.DownloadFileAsync(new Uri(@"ftp://" + RemoteAddress + FilesToDownload[0]), LocalDirectory + FilesToDownload[0]);
}

private void DownloadProgress(object sender, DownloadProgressChangedEventArgs e)
{
    // Handle progress
}

private void DownloadCompleted(object sender, AsyncCompletedEventArgs e)
{
    FilesToDownload.RemoveAt(0);
    if (FilesToDownload.Count > 0)
    {
        FileDownload();
    }
}
It works absolutely fine this way with small files: they are all downloaded one by one, progress is reported, and DownloadCompleted fires after each file. The issue I'm facing with big files is that the first download launches without any problem, but nothing happens after that. The DownloadCompleted event never fires for some reason. It looks like the WebClient doesn't know that the file has finished downloading, which is a problem since I'm using this event to launch the next download in the FilesToDownload list.
I've also tried to do this synchronously using WebClient.DownloadFile and a for loop to cycle through my FilesToDownload list. The first file downloads correctly, and I get an exception when the second download should start: "The underlying connection was closed: An unexpected error occurred on a receive".
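For reference, the synchronous attempt described above would look something like this (a sketch reconstructed from the description; the variable names mirror the asynchronous snippet and are assumptions):

```csharp
// Sequential download loop: the exception above is thrown when
// DownloadFile is called for the second item in the list.
for (int i = 0; i < FilesToDownload.Count; i++)
{
    DownloadClient.DownloadFile(
        new Uri(@"ftp://" + RemoteAddress + FilesToDownload[i]),
        LocalDirectory + FilesToDownload[i]);
}
```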
Finally, I've tried to go through this via FTP using edtFTPnet, but I'm facing download speed issues (i.e. my download goes full speed with the WebClient and only about a third of that speed with the edtFTPnet library).
Any thoughts? I have to admit that I'm running out of ideas here.
Upvotes: 3
Views: 4517
Reputation: 338
Forgot to update this thread, but I figured out how to sort this out a while ago.
The issue was that the data connection opened for a file transfer randomly times out or is closed by the server before the transfer ends. I haven't been able to figure out why, however, as there are loads of local and external network interfaces between my computer and the remote server. As it's totally random (i.e. the transfer works fine for five files in a row, times out for one file, works fine for the following files, etc.), the issue may be server or network related.
I'm now catching any FTP exception raised by the FTP client object during the download and issuing a REST command with an offset equal to the position in the data stream where the transfer stopped (i.e. the number of bytes already written to the local file). Doing so retrieves the remaining bytes that are missing from the local file.
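The same resume-on-failure idea can be sketched with the built-in FtpWebRequest, whose ContentOffset property issues the REST command before the transfer starts (a minimal sketch; the original solution used edtFTPnet, whose API differs, and the class and method names here are illustrative):

```csharp
using System;
using System.IO;
using System.Net;

class ResumingFtpDownloader
{
    // Downloads remoteUri to localPath, resuming from the current local
    // file length whenever the data connection drops mid-transfer.
    public static void Download(Uri remoteUri, string localPath, ICredentials credentials)
    {
        const int maxRetries = 5;
        for (int attempt = 0; attempt < maxRetries; attempt++)
        {
            // Bytes already on disk = offset where the previous transfer stopped.
            long offset = File.Exists(localPath) ? new FileInfo(localPath).Length : 0;

            var request = (FtpWebRequest)WebRequest.Create(remoteUri);
            request.Method = WebRequestMethods.Ftp.DownloadFile;
            request.Credentials = credentials;
            request.UseBinary = true;
            request.ContentOffset = offset; // sent to the server as a REST command

            try
            {
                using (var response = (FtpWebResponse)request.GetResponse())
                using (var ftpStream = response.GetResponseStream())
                using (var fileStream = new FileStream(localPath, FileMode.Append, FileAccess.Write))
                {
                    ftpStream.CopyTo(fileStream);
                }
                return; // transfer completed
            }
            catch (WebException)
            {
                // Data connection dropped; loop and resume from the new offset.
            }
        }
        throw new IOException("Download failed after " + maxRetries + " attempts.");
    }
}
```

Appending to the local file (FileMode.Append) keeps the already-downloaded bytes, so each retry only transfers what is still missing.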
Upvotes: 0
Reputation: 18843
public string GetRequest(Uri uri, int timeoutMilliseconds)
{
    var request = System.Net.WebRequest.Create(uri);
    request.Timeout = timeoutMilliseconds;
    using (var response = request.GetResponse())
    using (var stream = response.GetResponseStream())
    using (var reader = new System.IO.StreamReader(stream))
    {
        return reader.ReadToEnd();
    }
}
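For the multi-gigabyte files in the question, reading the whole response into a string is impractical; a stream-to-file variant of the same request (a sketch, method name assumed) applies the same Timeout setting without buffering the file in memory:

```csharp
public void GetRequestToFile(Uri uri, string localPath, int timeoutMilliseconds)
{
    var request = System.Net.WebRequest.Create(uri);
    request.Timeout = timeoutMilliseconds;
    using (var response = request.GetResponse())
    using (var stream = response.GetResponseStream())
    using (var file = System.IO.File.Create(localPath))
    {
        // Copy the response body straight to disk in chunks.
        stream.CopyTo(file);
    }
}
```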
Upvotes: 0