Reputation: 150
I know there's a ton of questions about zlib/gzip etc but none of them quite match what I'm trying to do (or at least I haven't found it). As a quick overview, I have a C# server that decompresses incoming strings using a GZipStream. My task is to write a C++ client that will compress a string compatible with GZipStream decompression.
When I use the code below I get an error that says "The magic number in GZip header is not correct. Make sure you are passing in a GZip stream." I understand what the magic number is and everything, I just don't know how to magically set it properly.
Finally, I'm using the C++ zlib nuget package but have also used the source files directly from zlib with the same bad luck.
Here's a more in-depth view:
The server's function for decompression
public static string ReadMessage(NetworkStream stream)
{
    byte[] buffer = new byte[512];
    StringBuilder messageData = new StringBuilder();
    GZipStream gzStream = new GZipStream(stream, CompressionMode.Decompress, true);
    int bytes = 0;
    while (true)
    {
        try
        {
            bytes = gzStream.Read(buffer, 0, buffer.Length);
        }
        catch (InvalidDataException ex)
        {
            Console.WriteLine($"Busted: {ex.Message}");
            return "";
        }

        // Use Decoder class to convert from bytes to Default
        // in case a character spans two buffers.
        Decoder decoder = Encoding.Default.GetDecoder();
        char[] chars = new char[decoder.GetCharCount(buffer, 0, bytes)];
        decoder.GetChars(buffer, 0, bytes, chars, 0);
        messageData.Append(chars);
        Console.WriteLine(messageData);

        // Check for EOF or an empty message.
        if (messageData.ToString().IndexOf("<EOF>", StringComparison.Ordinal) != -1)
            break;
    }
    int eof = messageData.ToString().IndexOf("<EOF>", StringComparison.Ordinal);
    string message = messageData.ToString().Substring(0, eof).Trim();
    // Returns message without ending EOF
    return message;
}
To sum it up: it accepts a NetworkStream, reads the compressed data, decompresses it, appends it to a string, and loops until it finds <EOF>, which is removed before the final decompressed string is returned. This is almost a direct match of the example on MSDN.
Here's the C++ client side code:
char* CompressString(char* message)
{
    int messageSize = sizeof(message);

    // Compress string
    z_stream zs;
    memset(&zs, 0, sizeof(zs));
    zs.zalloc = Z_NULL;
    zs.zfree = Z_NULL;
    zs.opaque = Z_NULL;
    zs.next_in = reinterpret_cast<Bytef*>(message);
    zs.avail_in = messageSize;

    int iResult = deflateInit2(&zs, Z_BEST_COMPRESSION, Z_DEFLATED, (MAX_WBITS + 16), 8, Z_DEFAULT_STRATEGY);
    if (iResult != Z_OK) zerr(iResult);

    int ret;
    char* outbuffer = new char[messageSize];
    std::string outstring;

    // retrieve the compressed bytes blockwise
    do {
        zs.next_out = reinterpret_cast<Bytef*>(outbuffer);
        zs.avail_out = sizeof(outbuffer);
        ret = deflate(&zs, Z_FINISH);
        if (outstring.size() < zs.total_out) {
            // append the block to the output string
            outstring.append(outbuffer, zs.total_out - outstring.size());
        }
    } while (ret == Z_OK);

    deflateEnd(&zs);

    if (ret != Z_STREAM_END) { // an error occurred that was not EOF
        std::ostringstream oss;
        oss << "Exception during zlib compression: (" << ret << ") " << zs.msg;
        throw std::runtime_error(oss.str());
    }

    return &outstring[0u];
}
Long story short, it accepts a string and runs it through a fairly standard zlib deflate, with windowBits set to (MAX_WBITS + 16) so the output is wrapped in a gzip header/footer. It then returns a char* of the compressed input. This is what is sent to the server above to be decompressed.
Thanks for any help you can give me! Also, let me know if you need any more information.
Upvotes: 2
Views: 1986
Reputation: 3560
In your CompressString function you return a char* obtained from a locally declared std::string. The string will be destroyed when the function returns, which releases the memory behind the pointer you've returned. It's likely that something is later allocated in that memory region and overwrites your compressed data before it gets sent.

You need to ensure the memory containing the compressed data remains allocated until it has been sent, for example by passing a std::string& into the function and storing the output there.
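A minimal sketch of that suggestion (the names and the placeholder body are mine, not from the original post): the caller owns the std::string, so the bytes remain valid after the function returns. The zlib deflate loop is elided here; any code that appends the compressed blocks to outstring works the same way.

```cpp
#include <cstddef>
#include <string>

// The caller passes in the output string, so its lifetime is the
// caller's responsibility and the data survives the call.
void CompressString(const char* message, std::size_t length, std::string& outstring)
{
    outstring.clear();
    // ... deflateInit2 / deflate loop as in the question, appending
    // each compressed block with outstring.append(outbuffer, n) ...
    outstring.append(message, length); // placeholder for the compressed bytes
}
```

The caller then keeps the string alive until the data has actually been sent, e.g. `std::string compressed; CompressString(msg, strlen(msg), compressed);` followed by sending `compressed.data()` / `compressed.size()`.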
An unrelated bug: you do char* outbuffer = new char[messageSize]; but there is no matching delete[] for that buffer, which results in a memory leak. Since you're also throwing exceptions from this function, I'd recommend using std::unique_ptr<char[]> rather than trying to sort this out with manual delete[] calls. In fact, I'd always recommend std::unique_ptr over explicit calls to delete where possible.
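For illustration (the function name and sizes here are made up): the array is released automatically when outbuffer goes out of scope, including when an exception propagates out of the function.

```cpp
#include <cstddef>
#include <memory>

// unique_ptr<char[]> owns the allocation; no delete[] is needed,
// even if this function returns early or throws.
std::size_t FillBuffer(std::size_t messageSize)
{
    std::unique_ptr<char[]> outbuffer(new char[messageSize]);
    // Use outbuffer.get() wherever a raw char* is needed, e.g.
    //   zs.next_out = reinterpret_cast<Bytef*>(outbuffer.get());
    outbuffer[0] = 'x';
    return messageSize; // buffer freed automatically here
}
```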
Upvotes: 4