Reputation: 143
I am having issues downloading large files (over 1.5 GB) from Google Drive using the .NET v3 NuGet package.
Using the code below:
public void DownloadFile(string fileId, string saveTo)
{
    var request = service.Files.Get(fileId);
    var stream = new System.IO.MemoryStream();

    // Add a handler which will be notified on progress changes.
    // It will notify on each chunk download and when the
    // download is completed or failed.
    request.MediaDownloader.ProgressChanged += (Google.Apis.Download.IDownloadProgress progress) =>
    {
        switch (progress.Status)
        {
            case Google.Apis.Download.DownloadStatus.Downloading:
                {
                    Console.WriteLine(progress.BytesDownloaded);
                    break;
                }
            case Google.Apis.Download.DownloadStatus.Completed:
                {
                    Console.WriteLine("Download complete.");
                    SaveStream(stream, saveTo);
                    break;
                }
            case Google.Apis.Download.DownloadStatus.Failed:
                {
                    Console.WriteLine("Download failed.");
                    break;
                }
        }
    };

    request.Download(stream);
    GC.Collect();
}

public void SaveStream(System.IO.MemoryStream stream, string saveTo)
{
    using (System.IO.FileStream file = new System.IO.FileStream(saveTo, System.IO.FileMode.Create, System.IO.FileAccess.Write))
    {
        try
        {
            stream.WriteTo(file);
        }
        catch (Exception ex)
        {
        }
    }
}
I get an exception in Google.Apis.Download.IDownloadProgress:
Exception of type 'System.OutOfMemoryException' was thrown.
Stack Trace is:
at System.IO.MemoryStream.set_Capacity(Int32 value)
at System.IO.MemoryStream.EnsureCapacity(Int32 value)
at System.IO.MemoryStream.Write(Byte[] buffer, Int32 offset, Int32 count)
at System.IO.MemoryStream.WriteAsync(Byte[] buffer, Int32 offset, Int32 count, CancellationToken cancellationToken)
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.ValidateEnd(Task task)
at Google.Apis.Download.MediaDownloader.<DownloadCoreAsync>d__31.MoveNext()
The console output looks something like this:
.
.
.
1310720000
1321205760
1331691520
1342177280
Download failed.
It generally fails around the same point in terms of number of bytes downloaded.
Is there any way to use partial downloads, or to resume a failed download? I can't find much documentation or any examples. I can't seem to access request.MediaDownloader.Range to set it, and I don't see any other way to set a Range header. I also tried playing around with chunk sizes, to no avail.
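For reference, this is roughly what I mean by a ranged request. It bypasses the MediaDownloader and calls the Drive v3 alt=media endpoint directly with the service's authenticated HttpClient; the endpoint URL and the assumption that service.HttpClient attaches the OAuth credentials are mine, and I have not gotten this working end to end, so treat it as a sketch only:

// Sketch only (not verified): download one byte range of a file by calling the
// Drive v3 media endpoint directly. Assumes service.HttpClient (the library's
// authenticated HttpClient) adds the OAuth token for us.
// Requires: using System.IO; using System.Net.Http; using System.Threading.Tasks;
public async Task DownloadRangeAsync(string fileId, string saveTo, long from, long to)
{
    var url = $"https://www.googleapis.com/drive/v3/files/{fileId}?alt=media";

    using (var httpRequest = new HttpRequestMessage(HttpMethod.Get, url))
    {
        // Ask the server for only the bytes in [from, to].
        httpRequest.Headers.Range = new System.Net.Http.Headers.RangeHeaderValue(from, to);

        using (var response = await service.HttpClient.SendAsync(httpRequest, HttpCompletionOption.ResponseHeadersRead))
        {
            response.EnsureSuccessStatusCode();

            // Append the chunk to the target file so successive ranges can be stitched together.
            using (var fileStream = new FileStream(saveTo, FileMode.Append, FileAccess.Write))
            {
                await response.Content.CopyToAsync(fileStream);
            }
        }
    }
}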
Upvotes: 2
Views: 1192
Reputation: 143
So, after consulting with someone from the GitHub thread here, I switched from a MemoryStream to a FileStream and the 4 GB file downloaded successfully.
This led to another issue: I could download 5 or 6 files in a row, and then the program would hang on the 7th file. The remedy I found was to .Flush() and .Close() the FileStream after each download. After that, I was able to download 20+ files straight through, 3-4 GB each in size (these are .ISO image files).
public void DownloadFile(string fileId, string saveTo)
{
    var request = service.Files.Get(fileId);
    FileStream fileStream = new FileStream(saveTo, FileMode.OpenOrCreate, FileAccess.Write);

    // Add a handler which will be notified on progress changes.
    // It will notify on each chunk download and when the
    // download is completed or failed.
    request.MediaDownloader.ProgressChanged += (Google.Apis.Download.IDownloadProgress progress) =>
    {
        switch (progress.Status)
        {
            case Google.Apis.Download.DownloadStatus.Downloading:
                {
                    Console.WriteLine(progress.BytesDownloaded);
                    break;
                }
            case Google.Apis.Download.DownloadStatus.Completed:
                {
                    Console.WriteLine("Download complete.");
                    fileStream.Flush();
                    fileStream.Close();
                    break;
                }
            case Google.Apis.Download.DownloadStatus.Failed:
                {
                    Console.WriteLine("Download failed.");
                    break;
                }
        }
    };

    request.Download(fileStream);
}
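A small follow-up variant of my own (not part of the original fix): wrapping the FileStream in a using block gives the same Flush/Close behaviour via Dispose, and it also runs when the download fails, so a failed download can't leave the file handle open. I also switched to FileMode.Create so a re-download truncates any previous copy; whether that is the desired behaviour is an assumption on my part. This uses only the calls already shown above:

// Variant sketch: let "using" dispose (flush + close) the FileStream on every path,
// including failed downloads, instead of closing it only in the Completed case.
public void DownloadFileUsing(string fileId, string saveTo)
{
    var request = service.Files.Get(fileId);

    using (var fileStream = new FileStream(saveTo, FileMode.Create, FileAccess.Write))
    {
        request.MediaDownloader.ProgressChanged += (Google.Apis.Download.IDownloadProgress progress) =>
        {
            Console.WriteLine($"{progress.Status}: {progress.BytesDownloaded} bytes");
        };

        // Download writes chunks straight into the FileStream; Dispose then flushes and closes it.
        request.Download(fileStream);
    }
}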
Upvotes: 2