Ikbal Rasimov

Reputation: 55

HTTP PUT request with content from an InputStream of unknown size; cannot set ChunkedStreamingMode in HttpURLConnection

I'm trying to send an HTTP PUT request to Azure Blob Storage, but ChunkedStreamingMode is not allowed, and I'm reading from an InputStream of unknown size. I can split the Put Blob request into multiple Put Block requests (Azure Blob Storage provides a Put Block operation that stores a single block, and at the end I can commit all the blocks into one blob). Is it a good solution to buffer 1 MiB at a time in memory and send each buffer as a block? Or is it better to read from the InputStream, save the content to a temporary file on the local file system, and then read the file and send it in blocks?
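For reference, the first option (staging 1 MiB blocks straight from the stream) can be sketched against the Put Block / Put Block List REST operations using plain `HttpURLConnection`. Because each block's size is known before it is sent, `setFixedLengthStreamingMode` can be used instead of chunked streaming. The class and method names below, and the assumption that the blob URL already carries a SAS token, are mine, not from any SDK:

```java
import java.io.IOException;
import java.io.InputStream;
import java.io.OutputStream;
import java.net.HttpURLConnection;
import java.net.URL;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.Base64;
import java.util.List;

public class BlockUploader {

    // Fill buf as far as possible from the stream; returns bytes read (0 at EOF).
    static int readFully(InputStream in, byte[] buf) throws IOException {
        int off = 0;
        while (off < buf.length) {
            int n = in.read(buf, off, buf.length - off);
            if (n < 0) break;
            off += n;
        }
        return off;
    }

    // blobSasUrl is assumed to look like:
    // https://<account>.blob.core.windows.net/<container>/<blob>?<sas-token>
    static void upload(String blobSasUrl, InputStream in) throws IOException {
        byte[] buf = new byte[1024 * 1024]; // 1 MiB per block
        List<String> blockIds = new ArrayList<>();
        int seq = 0;
        int n;
        while ((n = readFully(in, buf)) > 0) {
            // Block IDs must be Base64 and all the same length before encoding.
            String blockId = Base64.getEncoder()
                    .encodeToString(String.format("%08d", seq++).getBytes(StandardCharsets.UTF_8));
            HttpURLConnection conn = (HttpURLConnection) new URL(
                    blobSasUrl + "&comp=block&blockid=" + blockId).openConnection();
            conn.setRequestMethod("PUT");
            conn.setDoOutput(true);
            conn.setFixedLengthStreamingMode(n); // block size is known: no chunked mode needed
            try (OutputStream out = conn.getOutputStream()) {
                out.write(buf, 0, n);
            }
            if (conn.getResponseCode() != 201) {
                throw new IOException("Put Block failed: " + conn.getResponseCode());
            }
            blockIds.add(blockId);
        }
        // Commit the staged blocks into one blob with Put Block List.
        StringBuilder xml = new StringBuilder("<?xml version=\"1.0\" encoding=\"utf-8\"?><BlockList>");
        for (String id : blockIds) xml.append("<Latest>").append(id).append("</Latest>");
        xml.append("</BlockList>");
        byte[] body = xml.toString().getBytes(StandardCharsets.UTF_8);
        HttpURLConnection commit = (HttpURLConnection) new URL(
                blobSasUrl + "&comp=blocklist").openConnection();
        commit.setRequestMethod("PUT");
        commit.setDoOutput(true);
        commit.setFixedLengthStreamingMode(body.length);
        try (OutputStream out = commit.getOutputStream()) {
            out.write(body);
        }
        if (commit.getResponseCode() != 201) {
            throw new IOException("Put Block List failed: " + commit.getResponseCode());
        }
    }
}
```

This keeps memory use bounded at one block (1 MiB) regardless of the total stream size.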

Upvotes: 0

Views: 478

Answers (1)

Stanley Gong

Reputation: 12153

As I understand it, you want to upload a big file in chunks. I think both of your solutions could work, and I'll provide some sample code for your second one: save the input stream to a temp file and upload it in chunks. Just try the code below, which uses the Azure Blob SDK:

import java.time.Duration;

import com.azure.storage.blob.BlobClient;
import com.azure.storage.blob.BlobServiceClientBuilder;
import com.azure.storage.blob.ProgressReceiver;
import com.azure.storage.blob.models.AccessTier;
import com.azure.storage.blob.models.BlobHttpHeaders;
import com.azure.storage.blob.models.BlobRequestConditions;
import com.azure.storage.blob.models.ParallelTransferOptions;

public class StorageTest {

    public static void main(String[] args) {
        // skipping the code that saves the temp file; put its path here
        String tempFilePath = "";

        String connString = "<azure storage connection string>";
        String containerName = "<container name>";
        String destBlobName = "<blob name with path>";

        BlobClient blobClient = new BlobServiceClientBuilder().connectionString(connString).buildClient()
                .getBlobContainerClient(containerName).getBlobClient(destBlobName);
        // 1 MiB per request so uploading doesn't consume too much JVM memory
        long blockSize = 1024 * 1024;
        ParallelTransferOptions parallelTransferOptions = new ParallelTransferOptions().setBlockSizeLong(blockSize)
                // at most 2 concurrent requests; raise this to speed up the upload
                .setMaxConcurrency(2)
                .setProgressReceiver(new ProgressReceiver() {
                    @Override
                    public void reportProgress(long bytesTransferred) {
                        System.out.println("uploaded:" + bytesTransferred);
                    }
                });

        BlobHttpHeaders headers = new BlobHttpHeaders().setContentLanguage("en-US").setContentType("binary");

        blobClient.uploadFromFile(tempFilePath, parallelTransferOptions, headers, null, AccessTier.HOT,
                new BlobRequestConditions(), Duration.ofMinutes(30));

    }

}

I have tested this on my side and it works for uploading a 5 GB file. Let me know if you have any more questions.
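The temp-file step that the sample above skips can be sketched with `java.nio.file` from the standard library. The helper name here is my own, and it assumes the stream fits on the local disk:

```java
import java.io.IOException;
import java.io.InputStream;
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.StandardCopyOption;

public class TempFileHelper {
    // Spool an InputStream of unknown size to a temp file and return its path.
    static Path spoolToTempFile(InputStream in) throws IOException {
        Path tmp = Files.createTempFile("blob-upload-", ".tmp");
        tmp.toFile().deleteOnExit(); // best-effort cleanup on JVM exit
        Files.copy(in, tmp, StandardCopyOption.REPLACE_EXISTING);
        return tmp;
    }
}
```

The returned path can then be passed as `tempFilePath` to `uploadFromFile`, and the file deleted once the upload succeeds.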

Upvotes: 1
