Reputation: 65
I'm trying to upload a large file in ASP.NET MVC and then save it in the DB. But when I do this I get an OutOfMemoryException. Here is where I post my file:
[HttpPost]
public void UploadAttachments(int docId, int folderId)
{
    if (Request.Files.Count <= 0) return; //throw exception?
    for (var i = 0; i < Request.Files.Count; i++)
    {
        try
        {
            var file = Request.Files[i];
            ProjectAttachmentInput attachment = new ProjectAttachmentInput();
            attachment = SetAttachment(attachment, file, docId, folderId);
            _documentationAppService.SaveUploadedFile(attachment);
        }
        catch { }
    }
}
Next in SetAttachment():
private ProjectAttachmentInput SetAttachment(ProjectAttachmentInput attachment, HttpPostedFileBase file, int docId, int folderId)
{
    attachment.FileName = file.FileName;
    attachment.FileContent = ReadFully(file.InputStream);
    attachment.ProjectAttachmentId = docId;
    attachment.ProjectAttachmentFolderId = folderId;
    return attachment;
}
And when ReadFully runs, I get the OutOfMemoryException:
private static byte[] ReadFully(Stream input)
{
    using (MemoryStream ms = new MemoryStream())
    {
        input.CopyTo(ms);
        return ms.ToArray();
    }
}
The error is thrown on input.CopyTo(ms). I need a byte[] from the file so I can save it in the DB. This works when the file is under 100 MB, but with a 200 MB file I get the exception. What should I change in this code?
And my web.config configuration:
<system.webServer>
  <security>
    <requestFiltering>
      <requestLimits maxAllowedContentLength="2147483648" />
    </requestFiltering>
  </security>
...
<httpRuntime maxRequestLength="1073741824" targetFramework="4.5.1" />
Upvotes: 1
Views: 1276
Reputation: 2509
This happens because you are trying to load the whole file into an array in memory. The good practice here is to work with streams, not arrays, and persist the file in the database chunk by chunk as a BLOB (if you really want to have your file stored in the DB, which is a debatable choice).
However, in order to save streams as BLOBs, you will have to go through extra steps.
Perhaps this article can help you (I don't know what kind of DB you're using).
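To illustrate the idea, here is a minimal sketch assuming SQL Server. The Attachments table, its column names, and _connectionString are all placeholders for whatever your schema actually has. Since .NET 4.5, SqlClient accepts a Stream as the value of a varbinary(max) parameter and sends it to the server in chunks, so the file is never materialized as one big byte[]:
using System.Data;
using System.Data.SqlClient;
using System.Web;

private void SaveUploadedFileStreamed(HttpPostedFileBase file, int docId, int folderId)
{
    // Table and column names below are assumptions, not the asker's real schema.
    const string sql =
        @"INSERT INTO Attachments (ProjectAttachmentId, ProjectAttachmentFolderId, FileName, FileContent)
          VALUES (@docId, @folderId, @fileName, @fileContent)";

    using (var connection = new SqlConnection(_connectionString)) // _connectionString assumed to exist
    using (var command = new SqlCommand(sql, connection))
    {
        command.Parameters.AddWithValue("@docId", docId);
        command.Parameters.AddWithValue("@folderId", folderId);
        command.Parameters.AddWithValue("@fileName", file.FileName);

        // Size = -1 means varbinary(max); passing the InputStream itself lets
        // ADO.NET pump it to the server chunk by chunk instead of buffering it.
        command.Parameters.Add("@fileContent", SqlDbType.VarBinary, -1).Value = file.InputStream;

        connection.Open();
        command.ExecuteNonQuery();
    }
}
With this approach the file still ends up in the DB as you wanted, but memory use stays flat regardless of file size.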
Upvotes: 1
Reputation: 115691
First, if you expect large files to be uploaded to your web site, you should use streaming IO and never load the entire file into memory.
Second, storing files inside a database is not a very good idea, IMO (even with FILESTREAM -- again, this is my very own opinion).
With these two points in mind, here's what I suggest: stream each uploaded file straight to content-addressable storage on the filesystem, and keep only its metadata in the database.
The "content-addressable" part means that, essentially, whenever you save a file to the filesystem, you first compute hash of the file and then use this hash as a filename. This will get you automatic content deduplication and you'll not depend on the file name.
Upvotes: 1