Reputation: 1183
My application is built in C# and has been using a standard general-purpose v2 (GPv2) Azure Blob Storage account to upload files to containers, but we have been seeing inconsistent success in uploading files, so we decided to try the premium tier, which has reduced latency.
From the documentation, there doesn't seem to be any suggestion that the code should be different. We are, however, getting a 400 Bad Request on every upload attempt. We tried using a SAS but are still experiencing the same problem. I also tried creating the container manually instead of dynamically from code, but still encountered the same error.
Here are snippets of both approaches; hopefully someone will be able to point me in the right direction.
With SAS:
```
// Build an account-level SAS
AccountSasBuilder sas = new AccountSasBuilder
{
    // Allow access to blobs
    Services = AccountSasServices.Blobs,
    // Allow access to the service-level APIs
    ResourceTypes = AccountSasResourceTypes.Service,
    // Access expires in 1 hour!
    ExpiresOn = DateTimeOffset.UtcNow.AddHours(1)
};
// Allow all permissions
sas.SetPermissions(AccountSasPermissions.All);

// Create a SharedKeyCredential that we can use to sign the SAS token
StorageSharedKeyCredential credential = new StorageSharedKeyCredential(StorageName, StorageKey);

// Build a SAS URI
UriBuilder sasUri = new UriBuilder(StorageURL);
sasUri.Query = sas.ToSasQueryParameters(credential).ToString();

// Create a client that can authenticate with the SAS URI
BlobServiceClient service = new BlobServiceClient(sasUri.Uri);

//var storageAccount = CloudStorageAccount.Parse(ConfigurationManager.AppSettings["AzureWebJobsStorage"]);
//var client = storageAccount.CreateCloudBlobClient();
var blobContainer = service.GetBlobContainerClient(container);
//blobContainer.CreateIfNotExists(PublicAccessType.BlobContainer);
var blockBlob = blobContainer.GetBlobClient(file.Substring(file.LastIndexOf('/') + 1));
using (var fileStream = WaitForFile(file, FileMode.Open, FileAccess.Read, FileShare.Read))
{
    var response = blockBlob.UploadAsync(fileStream).Result;
    fileStream.Close();
}
```
Without SAS:
```
var client = storageAccount.CreateCloudBlobClient();
var blobContainer = client.GetContainerReference(container);
blobContainer.CreateIfNotExists(BlobContainerPublicAccessType.Blob);
var blockBlob = blobContainer.GetBlockBlobReference(file.Substring(file.LastIndexOf('/') + 1));
using (var fileStream = WaitForFile(file, FileMode.Open, FileAccess.Read, FileShare.Read))
{
    blockBlob.UploadFromStream(fileStream);
    fileStream.Close();
}
```
Upvotes: 2
Views: 3548
Reputation: 13009
The error code corresponds to:
BlockListTooLong (Bad Request, 400): The block list may not contain more than 50,000 blocks.
The other error codes are listed here:
https://learn.microsoft.com/en-us/rest/api/storageservices/blob-service-error-codes
You need to fix the error by reducing the size of the uploaded block blob, or by uploading it in larger blocks so that the total block count stays under 50,000.
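As a rough illustration (my sketch, not part of the original answer), the v12 Azure.Storage.Blobs SDK lets you control the block size through StorageTransferOptions, so a large file is split into fewer, larger blocks; the connection string variable, container, and file name below are placeholders:
```
using System;
using System.IO;
using System.Threading.Tasks;
using Azure.Storage;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

class UploadWithLargerBlocks
{
    static async Task Main()
    {
        // Placeholder connection string, container, and blob name.
        BlobClient blob = new BlobServiceClient(Environment.GetEnvironmentVariable("AZURE_STORAGE_CONNECTION_STRING"))
            .GetBlobContainerClient("mycontainer")
            .GetBlobClient("bigfile.bin");

        var options = new BlobUploadOptions
        {
            TransferOptions = new StorageTransferOptions
            {
                // 32 MiB blocks: even a 1 TiB blob then needs only
                // 32,768 blocks, well under the 50,000-block limit.
                InitialTransferSize = 32 * 1024 * 1024,
                MaximumTransferSize = 32 * 1024 * 1024,
                MaximumConcurrency = 4
            }
        };

        using FileStream stream = File.OpenRead("bigfile.bin");
        await blob.UploadAsync(stream, options);
    }
}
```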
Upvotes: 1
Reputation: 26450
Without knowing the exception, I'd say you may be hitting the following condition: https://learn.microsoft.com/en-us/rest/api/storageservices/using-blob-service-operations-with-azure-premium-storage
Premium GPv2 accounts do not support block blobs, or the File, Table, and Queue services.
You might want to try a premium BlockBlobStorage account instead:
However, premium BlockBlobStorage accounts do support block and append blobs.
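If you're unsure which kind of account the client is actually hitting, here is one way to check, as a sketch using the v12 SDK's GetAccountInfo call (the connection string variable is a placeholder):
```
using System;
using Azure.Storage.Blobs;
using Azure.Storage.Blobs.Models;

class CheckAccountKind
{
    static void Main()
    {
        // Placeholder connection string; substitute your own.
        var service = new BlobServiceClient(Environment.GetEnvironmentVariable("AZURE_STORAGE_CONNECTION_STRING"));

        AccountInfo info = service.GetAccountInfo().Value;

        // Block blobs on premium need AccountKind.BlockBlobStorage;
        // a premium GPv2 account will reject them.
        Console.WriteLine($"Kind: {info.AccountKind}, SKU: {info.SkuName}");
    }
}
```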
If that's not it, try to catch the exception. The legacy SDK's StorageException exposes the HttpStatusMessage through its RequestInformation property, which is where you'll find the details behind your bad request.
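A minimal sketch of catching and inspecting that exception for the legacy client in the second snippet (assuming the Microsoft.Azure.Storage packages; blockBlob and fileStream are the variables from the question) might look like:
```
using System;
using Microsoft.Azure.Storage;

try
{
    // blockBlob and fileStream come from the question's second snippet.
    blockBlob.UploadFromStream(fileStream);
}
catch (StorageException ex)
{
    // RequestInformation carries the HTTP status line and, when the
    // service returned one, the storage error code (e.g. 400 "Bad Request"
    // with an error code such as BlockListTooLong).
    Console.WriteLine(ex.RequestInformation.HttpStatusCode);
    Console.WriteLine(ex.RequestInformation.HttpStatusMessage);
    Console.WriteLine(ex.RequestInformation.ExtendedErrorInformation?.ErrorCode);
    throw;
}
```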
Upvotes: 0