Renya Karasuma

Reputation: 1068

Efficient and Secure Handling of Large ZIP Streams in Azure Blob Storage

I need to upload a stream coming from the browser over the network. The current approach uploads the stream for processing and, only if it is valid, proceeds with the regular flow. Since the stream can only be read once, I have considered two approaches:

Copy the stream in memory: This is not ideal due to the potential for high memory usage and the risk of DoS attacks, especially with large files.

Read from the file system after uploading: This involves writing the stream to the file system and then reading it from there to avoid keeping it in memory.

However, I am concerned that even if I write to the file system and then read from it, the read will still involve memory usage, and the extra disk round trip is not performant.

Are there any better approaches to overcome this problem while minimizing memory usage and ensuring security?

Upvotes: 0

Views: 99

Answers (1)

Venkatesan

Reputation: 10455

Are there any better approaches to overcome this problem while minimizing memory usage and ensuring security?

I agree with Mark Adler's and Andrew's comments: using the ZipFlow approach alongside a SAS token to upload directly to Azure Blob Storage is an effective way to minimize memory usage while streaming the file from the browser.

You can create an Azure Storage account SAS token with the following code.

Code:

 public static string CreateAccountSasToken(string accountName, string accountKey)
 {
     var sharedKeyCredential = new StorageSharedKeyCredential(accountName, accountKey);

     // Build an account-level SAS scoped to the Blob service
     // (requires the Azure.Storage and Azure.Storage.Sas namespaces).
     AccountSasBuilder sasBuilder = new AccountSasBuilder
     {
         ExpiresOn = DateTimeOffset.UtcNow.AddHours(1),  // Set expiration time
         Services = AccountSasServices.Blobs,            // Enable the Blob service (not Queues) for blob uploads
         ResourceTypes = AccountSasResourceTypes.Service | AccountSasResourceTypes.Container | AccountSasResourceTypes.Object,
     };

     sasBuilder.SetPermissions(AccountSasPermissions.Add | AccountSasPermissions.Create | AccountSasPermissions.Read | AccountSasPermissions.Write);

     string sasToken = sasBuilder.ToSasQueryParameters(sharedKeyCredential).ToString();
     return sasToken;
 }

Now you can use JavaScript to upload the file directly to Azure Blob Storage using the SAS token.
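As a minimal sketch of that browser-side upload, the snippet below PUTs the file to the blob endpoint via the Put Blob REST operation, with the SAS token appended as the query string. The names `accountName`, `containerName`, and `sasToken` are placeholders you would supply from your backend; the browser streams the `File` body itself, so your code never buffers the whole ZIP in memory.

```javascript
// Pure helper: build the blob URL with the SAS query string appended.
function buildBlobUrl(accountName, containerName, blobName, sasToken) {
  const base = `https://${accountName}.blob.core.windows.net`;
  const path = `${encodeURIComponent(containerName)}/${encodeURIComponent(blobName)}`;
  // SAS tokens are sometimes returned with a leading '?'; normalize it.
  const query = sasToken.startsWith('?') ? sasToken.slice(1) : sasToken;
  return `${base}/${path}?${query}`;
}

// Upload a browser File directly via the Put Blob REST operation.
async function uploadToBlob(file, accountName, containerName, sasToken) {
  const url = buildBlobUrl(accountName, containerName, file.name, sasToken);
  const response = await fetch(url, {
    method: 'PUT',
    headers: {
      'x-ms-blob-type': 'BlockBlob',
      'Content-Type': file.type || 'application/octet-stream',
    },
    body: file, // the browser streams the file; no in-memory copy in app code
  });
  if (!response.ok) throw new Error(`Upload failed: ${response.status}`);
}
```

Note that a single Put Blob request has a size limit; for very large files you would switch to staged block uploads (Put Block / Put Block List), which the `@azure/storage-blob` SDK handles for you.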

  • You can use a Blob Trigger Azure Function to scan files automatically after they are uploaded. This allows you to process the file as soon as it is available in Azure Blob Storage.
  • The function can read the file as a stream and then apply the ZipFlow method to scan the file without needing to load it entirely into memory.
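To illustrate the scanning step, here is a small sketch that checks whether the first bytes of the blob carry the ZIP local file header signature ("PK\x03\x04"), so an obviously non-ZIP upload can be rejected without reading the rest of the stream. The commented blob-trigger wiring follows the Node.js programming model of `@azure/functions`, but treat that wiring (trigger path, connection name) as an assumption for your setup.

```javascript
// First four bytes of a ZIP local file header: 0x50 0x4B 0x03 0x04 ("PK\x03\x04").
const ZIP_SIGNATURE = Buffer.from([0x50, 0x4b, 0x03, 0x04]);

// Pure check: does this buffer start with the ZIP signature?
function looksLikeZip(firstBytes) {
  return firstBytes.length >= 4 && ZIP_SIGNATURE.equals(firstBytes.subarray(0, 4));
}

// Hypothetical blob-trigger wiring (Azure Functions Node.js v4 model);
// only the first bytes of the blob need to be inspected:
//
// const { app } = require('@azure/functions');
// app.storageBlob('scanZip', {
//   path: 'uploads/{name}',
//   connection: 'AzureWebJobsStorage',
//   handler: async (blob, context) => {
//     if (!looksLikeZip(blob.subarray(0, 4))) {
//       context.error(`${context.triggerMetadata.name} is not a ZIP file`);
//     }
//   },
// });
```

A deeper validation (central directory, entry sizes) would then proceed over the stream chunk by chunk, which is the idea behind the ZipFlow approach mentioned above.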


Upvotes: 1
