Reputation: 2787
I’m developing an Azure Function to create a CSV file from a list of custom objects, gzip it and upload it to an Azure Storage container with this code:
// Requires CsvHelper plus the Azure.Storage.Blobs, Azure.Storage.Blobs.Specialized,
// System.Globalization, System.IO.Compression and System.Text namespaces.
var blobServiceClient = new BlobServiceClient("My connection string");
var containerClient = blobServiceClient.GetBlobContainerClient("My container name");
var config = new CsvConfiguration(CultureInfo.CurrentCulture) { Delimiter = ";", Encoding = Encoding.UTF8 };
var list = new List<FakeModel>
{
    new FakeModel { Field1 = "A", Field2 = "B" },
    new FakeModel { Field1 = "C", Field2 = "D" }
};
await using var memoryStream1 = new MemoryStream();
await using var streamWriter = new StreamWriter(memoryStream1);
await using var csvWriter = new CsvWriter(streamWriter, config);
await csvWriter.WriteRecordsAsync(list);
await csvWriter.FlushAsync();
memoryStream1.Position = 0;
await using var memoryStream2 = new MemoryStream();
// Dispose the GZipStream before seeking, so it flushes its buffers and writes the
// gzip footer (CRC and length); leaveOpen: true keeps memoryStream2 usable afterwards.
await using (var zip = new GZipStream(memoryStream2, CompressionMode.Compress, leaveOpen: true))
{
    await memoryStream1.CopyToAsync(zip);
}
memoryStream2.Position = 0;
var blockBlob = containerClient.GetBlockBlobClient("test.csv.gz");
await blockBlob.UploadAsync(memoryStream2);
It works. When I download the gzip from the cloud to check it, the blob has the correct name, so it shows up as a GZ file named test.csv.gz. But when I open it with an extractor, the CSV file inside has a strange name like test.csv-3, which my computer can't open. Of course, I need it to be a valid *.csv file. The problem here is that using memory streams I can only give a name to the blob, not to the inner CSV file. How can I do that? Keep in mind that I'd like to use memory streams to keep things simple with Azure Functions' local storage. Can you help me?
Upvotes: 0
Views: 487
Reputation: 23141
Regarding the issue, please refer to the following code:
var blobServiceClient = new BlobServiceClient("My connection string");
var containerClient = blobServiceClient.GetBlobContainerClient("My container name");
var config = new CsvConfiguration(CultureInfo.CurrentCulture) { Delimiter = ";", Encoding = Encoding.UTF8 };
var list = new List<FakeModel>
{
    new FakeModel { Field1 = "A", Field2 = "B" },
    new FakeModel { Field1 = "C", Field2 = "D" }
};
await using var memoryStream1 = new MemoryStream();
await using var streamWriter = new StreamWriter(memoryStream1);
await using var csvWriter = new CsvWriter(streamWriter, config);
await csvWriter.WriteRecordsAsync(list);
await csvWriter.FlushAsync();
memoryStream1.Position = 0;
var options = new BlockBlobOpenWriteOptions
{
    HttpHeaders = new BlobHttpHeaders
    {
        ContentType = "application/gzip",
    },
};
// Dispose the GZipStream (and then the blob write stream) before reading the blob
// back; otherwise the gzip footer is missing and the blocks are never committed.
await using (var outStream = await containerClient.GetBlockBlobClient("test.csv.gz").OpenWriteAsync(true, options))
await using (var zip = new GZipStream(outStream, CompressionMode.Compress, leaveOpen: true))
{
    await memoryStream1.CopyToAsync(zip);
}
// Read the blob back and decompress it to a local file to verify the round trip.
await using var input = await containerClient.GetBlockBlobClient("test.csv.gz").OpenReadAsync();
await using var file = File.Create("<file path>");
await using var zip1 = new GZipStream(input, CompressionMode.Decompress);
await zip1.CopyToAsync(file);
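As for the inner file name: gzip compresses a single byte stream, and .NET's GZipStream does not write the format's optional original-file-name field, so extractors simply derive the inner name from the archive name (stripping .gz). If a properly named inner test.csv is required, a zip archive stores entry names explicitly. A minimal sketch, assuming the same memoryStream1 CSV data from above and a blob name of test.zip (both assumptions, not part of the answer code):

```csharp
// Sketch: package the CSV as a named entry in a .zip instead of a bare .gz,
// so the inner file name ("test.csv") travels with the archive.
await using var zipStream = new MemoryStream();
using (var archive = new ZipArchive(zipStream, ZipArchiveMode.Create, leaveOpen: true))
{
    var entry = archive.CreateEntry("test.csv", CompressionLevel.Optimal);
    await using var entryStream = entry.Open();
    memoryStream1.Position = 0;
    await memoryStream1.CopyToAsync(entryStream);
} // disposing the archive writes the zip central directory
zipStream.Position = 0;
await containerClient.GetBlockBlobClient("test.zip").UploadAsync(zipStream);
```

Any extractor will then show the inner file as test.csv regardless of what the blob is called.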
Upvotes: 1