Shubham Tyagi

Reputation: 838

How to Upload Data to Azure Storage in chunks/blocks?

I'm uploading images to Azure Storage and need to implement chunking to upload large images, but I'm getting an exception such as "Offset and length out of bound":

int fileSize = imageModel.ImageByteArray.Length;
int blockSize = 100*1024 , numberOfBlocks = 0;

if (fileSize < blockSize)
    blockSize = fileSize;

if (fileSize % blockSize == 0)
    numberOfBlocks = fileSize / blockSize;
else
    numberOfBlocks = fileSize / blockSize + 1;

byte[] blockArray = new byte[blockSize];

for (int i = 0; i < numberOfBlocks; i++)
{
    Array.Copy(imageModel.ImageByteArray, i * blockSize, blockArray, 0, blockSize);
    await blob.UploadFromByteArrayAsync(blockArray, i, imageModel.ImageByteArray.Length);
    Progress = (float)i / numberOfBlocks;
}

var ResponseURL = blob.Uri.OriginalString;

Can anyone help me with the issue?

Upvotes: 0

Views: 474

Answers (1)

Stephen Cleary

Reputation: 456457

This code always copies blockSize bytes:

Array.Copy(imageModel.ImageByteArray, i * blockSize, blockArray, 0, blockSize);

However, the last block may not be a full blockSize:

if (fileSize % blockSize == 0)
    numberOfBlocks = fileSize / blockSize;
else
    numberOfBlocks = fileSize / blockSize + 1;

To fix this, you'll need to handle the last block specially. It may be a full blockSize bytes or it may be less.
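A minimal sketch of that fix, in the same style as the question's loop and reusing its names (imageModel.ImageByteArray, the 100 KB blockSize, Progress); the per-block upload call is left as a placeholder here:

int fileSize = imageModel.ImageByteArray.Length;
int blockSize = 100 * 1024;
// Round up so a partial final block still gets its own iteration.
int numberOfBlocks = (fileSize + blockSize - 1) / blockSize;

for (int i = 0; i < numberOfBlocks; i++)
{
    int offset = i * blockSize;
    // The last block may hold fewer than blockSize bytes.
    int currentBlockSize = Math.Min(blockSize, fileSize - offset);

    byte[] blockArray = new byte[currentBlockSize];
    Array.Copy(imageModel.ImageByteArray, offset, blockArray, 0, currentBlockSize);

    // ... upload blockArray (currentBlockSize bytes) as block i ...

    Progress = (float)(i + 1) / numberOfBlocks;
}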

Upvotes: 1
