Reputation: 1
I have a very interesting problem that I hope I can solve using .NET. I have a zip file in Google Cloud Storage that I want to decompress and move to a different bucket, but I don't have enough memory or disk space to download the whole file and decompress it. To work around this, I need to read the central directory at the end of the zip file and then do a streaming decompress. Has anyone worked on a similar issue?
So far I have figured out how to get the last 1024 bytes of the file using the following code:
var fileInfo = _storage.GetObject(BucketName, fileName, new GetObjectOptions { Projection = Projection.Full });  // metadata only, to learn the object's size
var stream = new MemoryStream();
// Range-download just the last 1024 bytes, where the end-of-central-directory record should be.
_storage.DownloadObject(BucketName, fileName, stream, new DownloadObjectOptions { Range = new System.Net.Http.Headers.RangeHeaderValue((long)(fileInfo.Size - 1024), (long)(fileInfo.Size)) });
The problem is that I can't read the central directory from this stream:
ZipArchive z = new ZipArchive(stream);
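From what I understand, ZipArchive can't work on this tail even after rewinding it: it reads the end-of-central-directory (EOCD) record and then seeks to central directory and local header offsets that refer to positions in the full object, which don't exist in a 1024-byte stream (and DownloadObject also leaves the stream positioned at its end). My current thinking is to parse the EOCD by hand and issue a second range request for just the central directory. Below is a minimal sketch of that idea, assuming a non-ZIP64 archive whose EOCD and comment fit in the last 1024 bytes (zip fields are little-endian, which is what BitConverter assumes on typical hosts):

stream.Position = 0;                                   // DownloadObject leaves the stream at its end
byte[] tail = stream.ToArray();

// Scan backwards for the EOCD signature 0x06054b50 ("PK\x05\x06").
int eocd = -1;
for (int i = tail.Length - 22; i >= 0; i--)
{
    if (tail[i] == 0x50 && tail[i + 1] == 0x4B && tail[i + 2] == 0x05 && tail[i + 3] == 0x06) { eocd = i; break; }
}
if (eocd < 0) throw new InvalidDataException("No EOCD record in the last 1024 bytes.");

uint cdSize   = BitConverter.ToUInt32(tail, eocd + 12);   // size of the central directory in bytes
uint cdOffset = BitConverter.ToUInt32(tail, eocd + 16);   // its offset from the start of the object

// Second range request: fetch only the central directory.
var cdStream = new MemoryStream();
_storage.DownloadObject(BucketName, fileName, cdStream, new DownloadObjectOptions
{
    Range = new System.Net.Http.Headers.RangeHeaderValue(cdOffset, cdOffset + cdSize - 1)
});
byte[] cd = cdStream.ToArray();

// Walk the central directory headers (signature 0x02014b50) to learn, per entry,
// the name, compression method, compressed size, and local header offset.
int pos = 0;
while (pos + 46 <= cd.Length && BitConverter.ToUInt32(cd, pos) == 0x02014b50)
{
    ushort method      = BitConverter.ToUInt16(cd, pos + 10);   // 8 = deflate, 0 = stored
    uint   compSize    = BitConverter.ToUInt32(cd, pos + 20);
    ushort nameLen     = BitConverter.ToUInt16(cd, pos + 28);
    ushort extraLen    = BitConverter.ToUInt16(cd, pos + 30);
    ushort commentLen  = BitConverter.ToUInt16(cd, pos + 32);
    uint   localHdrOff = BitConverter.ToUInt32(cd, pos + 42);
    string entryName   = System.Text.Encoding.UTF8.GetString(cd, pos + 46, nameLen);

    // A third range request starting at localHdrOff could fetch the local header
    // plus compSize bytes of data, which DeflateStream can inflate on its own.
    pos += 46 + nameLen + extraLen + commentLen;
}

Each entry could then be fetched with its own range request and streamed through DeflateStream into an upload to the destination bucket, so the full archive is never held in memory, but I'm not sure this is the most robust approach.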
Upvotes: 0
Views: 155
Reputation: 112384
You can try to adapt sunzip to your needs. It reads a zip file as a stream and decompresses it.
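If you want to stay in .NET, the same streaming idea can be sketched there: read the object as a forward-only stream, walk the local file headers, and inflate each entry with DeflateStream as its bytes arrive. This is only a rough sketch under simplifying assumptions: entries are stored or deflated, none use data descriptors (general-purpose bit 3) or ZIP64, the destination bucket name is a placeholder, and BoundedStream is a hypothetical wrapper that exposes exactly N bytes of the underlying stream, drains any leftover bytes on dispose, and never closes that stream. sunzip itself handles the general cases.

// Requires Google.Cloud.Storage.V1, System.IO.Compression, System.IO.Pipelines and
// System.Threading.Tasks; assumed to run inside an async method.
var pipe = new System.IO.Pipelines.Pipe();
var download = Task.Run(async () =>
{
    // Feed the whole object into the pipe, then complete the writer so the reader sees EOF.
    await _storage.DownloadObjectAsync(BucketName, fileName, pipe.Writer.AsStream());
    pipe.Writer.Complete();
});

using var zip = pipe.Reader.AsStream();
using var reader = new BinaryReader(zip);
while (reader.ReadUInt32() == 0x04034b50)                  // local file header signature "PK\x03\x04"
{
    reader.ReadUInt16();                                    // version needed to extract
    ushort flags    = reader.ReadUInt16();                  // bit 3 set would mean sizes follow the data
    ushort method   = reader.ReadUInt16();                  // 8 = deflate, 0 = stored
    reader.ReadBytes(8);                                    // mod time/date and CRC-32
    uint   compSize = reader.ReadUInt32();                  // compressed size of this entry
    reader.ReadUInt32();                                    // uncompressed size
    ushort nameLen  = reader.ReadUInt16();
    ushort extraLen = reader.ReadUInt16();
    string name = System.Text.Encoding.UTF8.GetString(reader.ReadBytes(nameLen));
    reader.ReadBytes(extraLen);

    // BoundedStream (hypothetical) limits reads to this entry's compressed bytes.
    using var entryBytes = new BoundedStream(zip, compSize);
    using Stream content = method == 8
        ? new System.IO.Compression.DeflateStream(entryBytes, System.IO.Compression.CompressionMode.Decompress)
        : entryBytes;
    // Stream the decompressed entry straight into the destination bucket
    // (assumes the uploader accepts a non-seekable source stream).
    await _storage.UploadObjectAsync("destination-bucket", name, null, content);
}
zip.CopyTo(Stream.Null);                                    // drain the trailing central directory
await download;

Memory use stays roughly bounded by the pipe's buffer plus the deflate window rather than by the size of the archive, which is the property that makes the streaming approach work on a machine that can't hold the whole file.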
Upvotes: 1