Convert large files in byte[] to store in SQL Server (C#)

I have a file of approximately 1.6 GB. I am aware of the options for storing large files in SQL Server, but I have a requirement to generate the blob (byte[]) for this large file.

I am using the following code to generate the byte[] for the large file:

string filePath = Server.MapPath(filename);   // filename holds the file's virtual path
string fileName = Path.GetFileName(filePath);

using (FileStream fs = new FileStream(filePath, FileMode.Open, FileAccess.Read))
using (BinaryReader br = new BinaryReader(fs))
{
    // Tries to allocate the entire 1.6 GB file as one contiguous byte[]
    byte[] bytes = br.ReadBytes((int)fs.Length);
}

It throws:

An exception of type 'System.OutOfMemoryException' occurred in mscorlib.dll but was not handled in user code

How can I convert this large file to a byte[]?

Upvotes: 0

Views: 1820

Answers (2)

Haitham Shaddad

Reputation: 4456

You should NOT do that. Either read the file in chunks (see the sketch after the SharePoint examples below) or, in your case, if you want to upload it to SharePoint, just connect the two streams together and the SharePoint library will do the rest. For example:

FileStream fileStream = File.OpenRead(fileName);
SPFile spfile = theLibrary.Files.Add(fileName, fileStream, true);

This is done with the SharePoint server-side object model; the same can be done with CSOM:

     Microsoft.SharePoint.Client.File.SaveBinaryDirect(clientContext, fileUrl, fileStream, true);
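
If the target really is SQL Server, here is a minimal sketch of the chunked approach. The table dbo.Files(Id, Data varbinary(max)), the fileId value, and the connection string are assumptions for illustration, not from the original post:

    // Sketch only: dbo.Files and fileId are hypothetical.
    // Requires System.Data, System.Data.SqlClient and System.IO.
    using (var conn = new SqlConnection(connectionString))
    using (var fs = File.OpenRead(filePath))
    {
        conn.Open();

        // Reset the blob to empty so that .WRITE can append to it.
        using (var init = new SqlCommand(
            "UPDATE dbo.Files SET Data = 0x WHERE Id = @id", conn))
        {
            init.Parameters.AddWithValue("@id", fileId);
            init.ExecuteNonQuery();
        }

        var buffer = new byte[81920]; // 80 KB per round trip; the file is never fully in memory
        int read;
        while ((read = fs.Read(buffer, 0, buffer.Length)) > 0)
        {
            var chunk = new byte[read];
            Array.Copy(buffer, chunk, read);

            // .WRITE with a NULL offset appends the chunk to the end of the column value.
            using (var cmd = new SqlCommand(
                "UPDATE dbo.Files SET Data.WRITE(@chunk, NULL, 0) WHERE Id = @id", conn))
            {
                cmd.Parameters.Add("@chunk", SqlDbType.VarBinary, -1).Value = chunk;
                cmd.Parameters.AddWithValue("@id", fileId);
                cmd.ExecuteNonQuery();
            }
        }
    }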

Upvotes: 2

Sunil Kumar

Reputation: 3242

Basically, you can't, not with a byte array.

You can use objects larger than 2 GB if you:

1) Are running in 64-bit mode on a 64-bit system. A 32-bit process cannot allocate an array that big; in practice it gets roughly 1.5 GB of usable memory.

2) Are running .NET Framework 4.5 or greater. And

3) Have set gcAllowVeryLargeObjects in your app.config (see the gcAllowVeryLargeObjects Element documentation and the snippet below).
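
For reference, enabling it looks like this in app.config (this is the documented element):

    <configuration>
      <runtime>
        <gcAllowVeryLargeObjects enabled="true" />
      </runtime>
    </configuration>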

But ... array indexers are still limited to an Int32, so you can't do this with byte arrays, as the array index would be too big.

You can do it with a stream (~8TB is allowed) but not with an array.
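
For the asker's SQL Server case, ADO.NET in .NET 4.5+ can stream a FileStream straight into a varbinary(max) parameter, so the file is read in chunks internally and no byte[] is ever built. A minimal sketch, where dbo.Files and its columns are made up for illustration:

    // Sketch only: dbo.Files(Name, Data varbinary(max)) is hypothetical.
    using (var conn = new SqlConnection(connectionString))
    using (var fs = File.OpenRead(filePath))
    using (var cmd = new SqlCommand(
        "INSERT INTO dbo.Files (Name, Data) VALUES (@name, @data)", conn))
    {
        cmd.Parameters.AddWithValue("@name", Path.GetFileName(filePath));

        // Size -1 means varbinary(max); passing the stream itself makes
        // ADO.NET pull from it in chunks instead of buffering a byte[].
        cmd.Parameters.Add("@data", SqlDbType.VarBinary, -1).Value = fs;

        conn.Open();
        cmd.ExecuteNonQuery();
    }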

Upvotes: 1
