Reputation: 1235
I am trying to upload a large blob to Azure Blob Storage but am getting this exception:
System.NotSupportedException: 'Specified method is not supported.'
.NET 9, function app hosted on an App Service plan.
My code is:
[Function("uploadExtratcs")]
public async Task<IActionResult> Run([HttpTrigger(AuthorizationLevel.Function, "post",
Route = "extracts/{systemName}/{extractType}")] HttpRequest req
, string systemName, string extractType)
{
try
{
logger.LogInformation($"Uploading blob {systemName} of type {extractType} has started.");
BlobContainerClient containerClient = blobServiceClient.GetBlobContainerClient("test");
BlockBlobClient blockBlobClient = containerClient.GetBlockBlobClient("test.csv");
int blockSize = 1 * 1024 * 1024;//1 MB Block
int offset = 0;
int counter = 0;
List<string> blockIds = new List<string>();
var bytesRemaining = req.Body.Length; // <-- Error is thrown on this line
do
{
var dataToRead = Math.Min(bytesRemaining, blockSize);
byte[] data = new byte[dataToRead];
var dataRead = req.Body.Read(data, offset, (int)dataToRead);
bytesRemaining -= dataRead;
if (dataRead > 0)
{
var blockId = Convert.ToBase64String(Encoding.UTF8.GetBytes(counter.ToString("d6")));
await blockBlobClient.StageBlockAsync(blockId, new MemoryStream(data));
logger.LogInformation(string.Format("Block {0} uploaded successfully.", counter.ToString("d6")));
blockIds.Add(blockId);
counter++;
}
}
while (bytesRemaining > 0);
var headers = new BlobHttpHeaders()
{
ContentType = "application/octet-stream"
};
Response<BlobContentInfo> response = await blockBlobClient.CommitBlockListAsync(blockIds, headers);
await blobStorageService.UploadBlob("pvdb-extracts", $"{systemName}/{extractType}.csv", req.Body);
logger.LogInformation("Uploading blob has finished.");
return new OkObjectResult(new { message = "upload successfully" });
}
catch (System.Exception ex)
{
logger.LogError(ex, "Error uploading file");
throw;
}
}
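As far as I can tell, the NotSupportedException comes from req.Body.Length: the request body is a non-seekable stream, so Length is not supported. A rough sketch of staging blocks without ever asking for the length (reusing req, blockBlobClient and blockSize from the code above) would look something like this:
// Read the body in fixed-size chunks until ReadAsync returns 0, instead of relying on req.Body.Length.
var blockIds = new List<string>();
byte[] buffer = new byte[blockSize];
int counter = 0;
while (true)
{
    // ReadAsync may return fewer bytes than requested, so keep filling the buffer.
    int filled = 0;
    while (filled < buffer.Length)
    {
        int read = await req.Body.ReadAsync(buffer, filled, buffer.Length - filled);
        if (read == 0) break; // end of stream
        filled += read;
    }
    if (filled == 0) break; // nothing left to stage

    string blockId = Convert.ToBase64String(Encoding.UTF8.GetBytes(counter.ToString("d6")));
    using (var blockStream = new MemoryStream(buffer, 0, filled, writable: false))
    {
        await blockBlobClient.StageBlockAsync(blockId, blockStream);
    }
    blockIds.Add(blockId);
    counter++;
}
await blockBlobClient.CommitBlockListAsync(blockIds);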
Update 1: I have made some changes to the code and now I am getting this error:
Microsoft.AspNetCore.Server.Kestrel.Core.BadHttpRequestException: 'Request body too large. The max request body size is 30000000 bytes.'
on this line:
dataRead = await req.Body.ReadAsync(data, offset, dataToRead);
So it looks like the blob stream I am passing to the function app is quite large and that limit needs to be changed somewhere. I did not get that error with a smaller stream.
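Another option I am considering (unverified) is lifting the limit for this one request instead of globally, via the Kestrel request feature, before the body is read:
// Assumption: this runs at the top of the function, before req.Body has been read.
// Requires: using Microsoft.AspNetCore.Http.Features;
var sizeFeature = req.HttpContext.Features.Get<IHttpMaxRequestBodySizeFeature>();
if (sizeFeature != null && !sizeFeature.IsReadOnly)
{
    sizeFeature.MaxRequestBodySize = null; // null removes the limit; a specific byte count also works
}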
Update 2:
I have increased the limit, and it looks like I don't need to split the request body into chunks; it works with the standard SDK method:
services.Configure<KestrelServerOptions>(options =>
{
options.Limits.MaxRequestBodySize = 300000000;
});
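For reference, this is roughly where that option sits in the isolated worker's Program.cs (a sketch, assuming the ASP.NET Core integration model, which the HttpRequest/IActionResult signatures imply):
using Microsoft.AspNetCore.Server.Kestrel.Core;
using Microsoft.Azure.Functions.Worker;
using Microsoft.Extensions.DependencyInjection;
using Microsoft.Extensions.Hosting;

var host = new HostBuilder()
    .ConfigureFunctionsWebApplication() // ASP.NET Core integration for the isolated worker
    .ConfigureServices(services =>
    {
        // Raise Kestrel's default 30 MB request body limit to roughly 300 MB.
        services.Configure<KestrelServerOptions>(options =>
        {
            options.Limits.MaxRequestBodySize = 300000000;
        });
    })
    .Build();

host.Run();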
and now I can use the following SDK method to upload the whole stream:
Response<BlobContentInfo> response = await blobClient.UploadAsync(blobContent, true);
It looks like there is no need to split into chunks, at least in my case.
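If chunking ever becomes necessary again, the SDK can also do it internally; a sketch using StorageTransferOptions (from the Azure.Storage namespace; the sizes and concurrency below are just assumptions, not tuned values):
// Let the SDK split the stream into blocks and upload them in parallel.
var uploadOptions = new BlobUploadOptions
{
    HttpHeaders = new BlobHttpHeaders { ContentType = "application/octet-stream" },
    TransferOptions = new StorageTransferOptions
    {
        InitialTransferSize = 4 * 1024 * 1024, // threshold below which a single request is used
        MaximumTransferSize = 4 * 1024 * 1024, // size of each staged block
        MaximumConcurrency = 4                 // number of blocks uploaded in parallel
    }
};
Response<BlobContentInfo> response = await blobClient.UploadAsync(req.Body, uploadOptions);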
Update 3:
I am able to test it locally after setting the payload size to 300 MB, but when I deploy I am getting the following error:
Exception while executing function: Functions.uploadExtratcs Exception binding parameter 'req' Request body too large. The max request body size is 100 bytes.
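That error seems to come from the Functions host itself rather than Kestrel, so the Kestrel option alone does not help once deployed. If I understand correctly, the host's body size limit can be overridden with the FUNCTIONS_REQUEST_BODY_SIZE_LIMIT application setting (value in bytes); something like this (the app and resource group names are placeholders):
az functionapp config appsettings set --name <function-app-name> --resource-group <resource-group> --settings FUNCTIONS_REQUEST_BODY_SIZE_LIMIT=314572800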
Upvotes: 0
Views: 82