Reputation: 2082
I am trying to upload a 300GB file to Azure Blob Storage. Below is the code that I am using:
// content = array of bytes ~ 300 GB
using (var stream = new MemoryStream(content))
{
    var blobRequestOptions = new BlobRequestOptions
    {
        ParallelOperationThreadCount = Microsoft.WindowsAzure.Storage.Shared.Protocol.Constants.MaxParallelOperationThreadCount
    };
    blob.UploadFromStream(stream, options: blobRequestOptions);
}
This operation fails with the following error message:
The request body is too large and exceeds the maximum permissible limit
Upvotes: 3
Views: 9208
Reputation: 71118
I believe the issue (per the comment confirming an older SDK version) is the client SDK version. Starting with v8.0, larger block blobs are supported (the size limit goes from 200 GB up to 4.77 TB), with blocks now up to 100 MB each (vs. the old 4 MB limit). The 50,000-block limit still applies, so 100 MB x 50,000 blocks yields the 4.77 TB maximum size.
Prior SDK versions were limited to 4 MB blocks and a 200 GB maximum block blob size.
Larger Block Blobs are supported by the most recent releases of the .NET Client Library (version 8.0.0), the Java Client Library (version 5.0.0), the Node.js Client Library (version 2.0.0), and the AzCopy Command-Line Utility (version 5.2.0). You can also use the REST API directly, as always; larger Block Blobs are supported by REST API version 2016-05-31 and later.
More info here.
Upvotes: 6