kol1991

Reputation: 65

How to Upload a Large File in an ASP.NET MVC HttpPost

I'm trying to upload a large file in ASP.NET MVC and then save it in the DB, but when I do this I get an OutOfMemoryException. Here is how I post my file:

[HttpPost]
public void UploadAttachments(int docId, int folderId)
{
    if (Request.Files.Count <= 0) return;  // throw exception?
    for (var i = 0; i < Request.Files.Count; i++)
    {
        try
        {
            var file = Request.Files[i];

            ProjectAttachmentInput attachment = new ProjectAttachmentInput();
            attachment = SetAttachment(attachment, file, docId, folderId);
            _documentationAppService.SaveUploadedFile(attachment);
        }
        catch { }
    }
}

Then, in SetAttachment():

private ProjectAttachmentInput SetAttachment(ProjectAttachmentInput attachment, HttpPostedFileBase file, int docId, int folderId)
{
    attachment.FileName = file.FileName;
    attachment.FileContent = ReadFully(file.InputStream);
    attachment.ProjectAttachmentId = docId;
    attachment.ProjectAttachmentFolderId = folderId;
    return attachment;
}

And when I call ReadFully I get an OutOfMemoryException...

private static byte[] ReadFully(Stream input)
{
    using (MemoryStream ms = new MemoryStream())
    {
        input.CopyTo(ms);
        return ms.ToArray();
    }
}

The error occurs at input.CopyTo(ms). I need a byte[] from the file so I can save it in the DB. This works if the file is less than 100 MB, but when I try 200 MB I get the exception.

What should I change in this code?

And here is my web.config configuration:

<system.webServer>
  <security>
    <requestFiltering>
      <requestLimits maxAllowedContentLength="2147483648" />
    </requestFiltering>
  </security>
...

<httpRuntime maxRequestLength="1073741824" targetFramework="4.5.1" />

Upvotes: 1

Views: 1276

Answers (2)

Fabio Salvalai

Reputation: 2509

This happens because you are trying to load the whole file into an array in memory. Good practice here is to work with streams, not arrays, and persist the file in the database chunk by chunk as a BLOB (if you really want the file stored in the DB, which is a debatable choice).

However, in order to save streams as BLOBS, you will have to go through extra steps.
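For SQL Server, one way to do the chunk-by-chunk write is T-SQL's .WRITE clause on a varbinary(max) column. Here is a minimal sketch, assuming a hypothetical ProjectAttachments table with Id and FileContent columns (not your actual schema):

using System.Data;
using System.Data.SqlClient;
using System.IO;

public static class AttachmentStore
{
    // Appends the upload to a varbinary(max) column in 64 KB chunks via
    // T-SQL's .WRITE clause, so the whole file is never held in memory.
    // Table and column names are hypothetical.
    public static void SaveStreamed(Stream input, int attachmentId, string connectionString)
    {
        var buffer = new byte[64 * 1024];

        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();

            // .WRITE cannot append to a NULL value, so initialize the BLOB to 0x first.
            using (var init = new SqlCommand(
                "UPDATE ProjectAttachments SET FileContent = 0x WHERE Id = @id", connection))
            {
                init.Parameters.AddWithValue("@id", attachmentId);
                init.ExecuteNonQuery();
            }

            int read;
            while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
            {
                using (var append = new SqlCommand(
                    "UPDATE ProjectAttachments SET FileContent.WRITE(@chunk, NULL, NULL) WHERE Id = @id",
                    connection))
                {
                    // Setting Size = read makes ADO.NET send only the bytes actually read.
                    append.Parameters.Add("@chunk", SqlDbType.VarBinary, read).Value = buffer;
                    append.Parameters.AddWithValue("@id", attachmentId);
                    append.ExecuteNonQuery();
                }
            }
        }
    }
}

In your action you would then pass Request.Files[i].InputStream straight to SaveStreamed instead of calling ReadFully, so no byte[] of the full file is ever allocated.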

Perhaps this article can help you (I don't know what kind of DB you're using):

https://www.mssqltips.com/sqlservertip/1489/using-filestream-to-store-blobs-in-the-ntfs-file-system-in-sql-server-2008/
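If you do use FILESTREAM as that article describes, System.Data.SqlTypes.SqlFileStream lets you copy the upload straight into the NTFS-backed column. A rough sketch, again with a hypothetical ProjectAttachments table (FileContent being the FILESTREAM column and Id an identity):

using System.Data.SqlClient;
using System.Data.SqlTypes;
using System.IO;

public static class FileStreamStore
{
    public static void Save(Stream input, string fileName, string connectionString)
    {
        using (var connection = new SqlConnection(connectionString))
        {
            connection.Open();
            using (var transaction = connection.BeginTransaction())
            {
                string path;
                byte[] txContext;

                // Insert a zero-length placeholder, then ask SQL Server where the
                // FILESTREAM data lives and for the current transaction context.
                using (var cmd = new SqlCommand(
                    @"INSERT INTO ProjectAttachments (FileName, FileContent) VALUES (@name, 0x);
                      SELECT FileContent.PathName(), GET_FILESTREAM_TRANSACTION_CONTEXT()
                      FROM ProjectAttachments WHERE Id = SCOPE_IDENTITY();",
                    connection, transaction))
                {
                    cmd.Parameters.AddWithValue("@name", fileName);
                    using (var reader = cmd.ExecuteReader())
                    {
                        reader.Read();
                        path = reader.GetString(0);
                        txContext = (byte[])reader[1];
                    }
                }

                // Stream the upload into NTFS; nothing is buffered beyond
                // CopyTo's internal buffer.
                using (var target = new SqlFileStream(path, txContext, FileAccess.Write))
                {
                    input.CopyTo(target);
                }

                transaction.Commit();
            }
        }
    }
}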

Upvotes: 1

Anton Gogolev

Reputation: 115691

First, if you expect large files to be uploaded to your web site, you should use streaming IO and never load the entire file into memory.

Second, storing files inside a database is not a very good idea, IMO (even with FILESTREAM -- again, this is my very own opinion).

With these two points in mind, here's what I suggest:

  1. Store files in the filesystem in a content-addressable manner. You can do this with purely streaming IO, and file size will not be an issue (as long as your disk is not full)
  2. In your database, keep only the file hashes

The "content-addressable" part means that, essentially, whenever you save a file to the filesystem, you first compute hash of the file and then use this hash as a filename. This will get you automatic content deduplication and you'll not depend on the file name.

Upvotes: 1
