CSharpDev

Reputation: 382

Read a large file into a byte array and encode it with Convert.ToBase64String

I have implemented a POC that reads an entire file's content into a byte[] array. It works for files under 100MB, but when I load a file larger than 100MB it throws:

Convert.ToBase64String(mybytearray) Cannot obtain value of the local variable or argument because there is not enough memory available.

Below is the code I use to read the file content into a byte array:

var sFile = fileName;
var mybytearray = File.ReadAllBytes(sFile);

var binaryModel = new BinaryModel
{
    fileName = binaryFile.FileName,
    binaryData = Convert.ToBase64String(mybytearray),
    filePath = string.Empty
};

My model class is as below

public class BinaryModel
{
    public string fileName { get; set; }
    public string binaryData { get; set; }
    public string filePath { get; set; }
}

I am getting the error "Cannot obtain value of the local variable or argument because there is not enough memory available." at the Convert.ToBase64String(mybytearray) call.

Is there anything I need to take care of to prevent this error?

Note: I do not want to add line breaks to my file content

Upvotes: 3

Views: 4093

Answers (2)

Paweł Dyl

Reputation: 9143

To save memory, you can convert the stream of bytes in 3-byte packs. Every three input bytes produce four Base64 output characters, so you don't need the whole file in memory at once.

Here is pseudocode:

Repeat:
1. Read up to 3 bytes from the input stream
2. Convert them to Base64 and write them to the output stream

And simple implementation:

using (var inStream = File.OpenRead("E:\\Temp\\File.xml"))
using (var outStream = File.CreateText("E:\\Temp\\File.base64"))
{
    var buffer = new byte[3];
    int read;
    // Each 3-byte block encodes to a self-contained 4-character Base64 group,
    // so the concatenated output matches encoding the whole file at once.
    while ((read = inStream.Read(buffer, 0, 3)) > 0)
    {
        var base64 = Convert.ToBase64String(buffer, 0, read);
        outStream.Write(base64);
    }
}

Hint: any buffer size that is a multiple of 3 is valid. Larger means more memory but better performance; smaller means less memory but worse performance.
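
As an alternative to hand-rolling the loop, the framework's System.Security.Cryptography.ToBase64Transform can do the 3-byte blocking for you when wrapped in a CryptoStream; a minimal sketch, assuming the same example file paths:

using (var inStream = File.OpenRead("E:\\Temp\\File.xml"))
using (var outStream = File.Create("E:\\Temp\\File.base64"))
using (var base64Stream = new CryptoStream(outStream, new ToBase64Transform(), CryptoStreamMode.Write))
{
    // CopyTo pumps the input through the transform in buffered chunks;
    // disposing the CryptoStream flushes the final (possibly padded) block.
    inStream.CopyTo(base64Stream);
}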

Additional info:

The file stream here is just an example. For a web response, use [HttpContext].Response.OutputStream and write directly to it. Processing hundreds of megabytes in one chunk will kill you and your server.
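
For instance, in classic ASP.NET (System.Web) the same loop can write straight to the response; a rough sketch, assuming it runs where HttpContext.Current is available and fileName holds the file path:

var response = HttpContext.Current.Response;
using (var inStream = File.OpenRead(fileName))
using (var writer = new StreamWriter(response.OutputStream))
{
    var buffer = new byte[3 * 1024]; // a multiple of 3, per the hint above
    int read;
    while ((read = inStream.Read(buffer, 0, buffer.Length)) > 0)
    {
        // each chunk is encoded and sent without buffering the whole file
        writer.Write(Convert.ToBase64String(buffer, 0, read));
    }
}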

Think about the total memory requirements. 100 MB of file bytes becomes roughly 133 MB of Base64 characters, and about double that as a UTF-16 string in memory. Since you put it in a model, expect another copy of that data in the response. And remember, that is just a single request; a few such requests could drain your memory.
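
For the curious, the back-of-envelope arithmetic behind those numbers, as a quick C# sketch:

long inputBytes = 100L * 1024 * 1024;          // 104,857,600 input bytes
long base64Chars = 4 * ((inputBytes + 2) / 3); // 139,810,136 chars, about 133 MB
long stringBytes = base64Chars * 2;            // as a UTF-16 string: about 267 MB in memory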

Upvotes: 2

James Harcourt

Reputation: 6379

I would use two FileStreams - one to read the large file, one to write the result back out.

So, in chunks, you would convert to Base64 ... then convert the resulting string to bytes ... and write.

    private static void ConvertLargeFileToBase64()
    {
        // Buffer length must be a multiple of 3 so each chunk encodes to Base64
        // without padding; otherwise '=' characters would appear mid-stream.
        var buffer = new byte[3 * 4096];
        using (var fsIn = new FileStream("D:\\in.txt", FileMode.Open, FileAccess.Read, FileShare.ReadWrite))
        {
            using (var fsOut = new FileStream("D:\\out.txt", FileMode.CreateNew, FileAccess.Write))
            {
                int read;
                while ((read = fsIn.Read(buffer, 0, buffer.Length)) > 0)
                {
                    // encode only the bytes actually read, then convert the
                    // Base64 string to ASCII bytes for writing back to file
                    var b64 = Encoding.ASCII.GetBytes(Convert.ToBase64String(buffer, 0, read));

                    // write the full encoded chunk to the output filestream
                    fsOut.Write(b64, 0, b64.Length);
                }
            }
        }
    }

Upvotes: 0
