Grace

Reputation: 1265

Unzipping a buffer with a large data length crashes

This is the function I am using to unzip a buffer:

string unzipBuffer(size_t decryptedLength, unsigned char * decryptedData)
{
    z_stream stream;
    stream.zalloc = Z_NULL;
    stream.zfree = Z_NULL;
    stream.avail_in = decryptedLength;
    stream.next_in = (Bytef *)decryptedData;
    stream.total_out = 0;
    stream.avail_out = 0;
    size_t dataLength = decryptedLength * 1.5;
    char c[dataLength];

    if (inflateInit2(&stream, 47) == Z_OK)
    {
        int status = Z_OK;
        while (status == Z_OK)
        {
            if (stream.total_out >= dataLength)
            {
                dataLength += decryptedLength * 0.5;
            }

            stream.next_out = (Bytef *)c + stream.total_out;

            stream.avail_out = (uint)(dataLength - stream.total_out);

            status = inflate (&stream, Z_SYNC_FLUSH);

        }
        if (inflateEnd(&stream) == Z_OK)
        {
            if (status == Z_STREAM_END)
            {
                dataLength = stream.total_out;
            }
        }
    }
    std::string decryptedContentStr(c, c + dataLength);
    return decryptedContentStr;
}

It was working fine until today, when I realized that it crashes with a large data buffer (e.g., decryptedLength: 342792) on this line:

status = inflate (&stream, Z_SYNC_FLUSH);

after one or two iterations. Can anyone help me please?

Upvotes: 2

Views: 362

Answers (2)

honk

Reputation: 9743

If your code generally works correctly, but fails for large data sets, then this could be due to a stack overflow as indicated by @StillLearning in his comment.

A usual (default) stack size is 1 MB. With a decryptedLength of 342,792, you try to allocate 514,188 bytes in the following line:

char c[dataLength];

Together with other allocations in your code (and finally in the inflate() function), this might already be too much. To overcome this problem, you should allocate the memory dynamically:

char* c = new char[dataLength];

If you do this, then please do not forget to release the allocated memory at the end of your unzipBuffer() function:

delete[] c;

If you forget to delete the allocated memory, then you will have a memory leak.

Even if this doesn't (fully) solve your problem, you should make the change anyway, because for even larger data sets your code is bound to break due to the limited size of the stack.


In case you need to "reallocate" your dynamically allocated buffer in your while() loop, then please take a look at this Q&A. Basically, you need a combination of new, std::copy, and delete[]. However, it would be more appropriate to exchange your char array for a std::vector<char> or even a std::vector<Bytef>. Then you would be able to enlarge the buffer easily by using its resize() function, and you can access the underlying buffer via &my_vector[0] in order to assign it to stream.next_out.
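
Putting both suggestions together, a minimal sketch of the std::vector variant could look like this (the initial 1.5× size guess, the growth step, and the inflateInit2(&stream, 47) call are carried over from the question; error handling stays equally minimal):

#include <string>
#include <vector>
#include <zlib.h>

std::string unzipBuffer(size_t decryptedLength, unsigned char* decryptedData)
{
    z_stream stream{};  // value-initialization sets zalloc, zfree, and opaque to Z_NULL
    stream.avail_in = (uInt)decryptedLength;
    stream.next_in = (Bytef*)decryptedData;

    // Heap-allocated output buffer; the small floor avoids a zero-sized buffer.
    std::vector<char> c(decryptedLength + decryptedLength / 2 + 64);

    if (inflateInit2(&stream, 47) != Z_OK)
        return std::string();

    int status = Z_OK;
    while (status == Z_OK)
    {
        // Unlike the original code, this actually enlarges the buffer.
        if (stream.total_out >= c.size())
            c.resize(c.size() + decryptedLength / 2 + 64);

        // resize() may move the data, so recompute next_out afterwards.
        stream.next_out = (Bytef*)&c[stream.total_out];
        stream.avail_out = (uInt)(c.size() - stream.total_out);

        status = inflate(&stream, Z_SYNC_FLUSH);
    }

    size_t dataLength = 0;
    if (inflateEnd(&stream) == Z_OK && status == Z_STREAM_END)
        dataLength = stream.total_out;

    return std::string(c.data(), c.data() + dataLength);
}

The vector releases its memory automatically when the function returns, so no explicit delete[] is needed, not even on the early return path.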

Upvotes: 2

Mark Adler

Reputation: 112239

c is not going to get bigger just because you increase dataLength. You are probably overwriting past the end of c because your initial guess of 1.5 times the compressed size was too small, causing the fault.

(It might be a stack overflow as suggested in another answer here, but I think that 8 MB default stacks are common nowadays.)
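
To make the overwrite concrete, here is the fragment from the question annotated (the numbers assume decryptedLength == 342792, so dataLength starts at 514,188):

if (stream.total_out >= dataLength)
{
    dataLength += decryptedLength * 0.5;   // only the bookkeeping grows...
}

stream.next_out = (Bytef *)c + stream.total_out;           // ...c is still 514,188 bytes,
stream.avail_out = (uint)(dataLength - stream.total_out);  // so inflate() is now allowed
                                                           // to write past the end of c

status = inflate (&stream, Z_SYNC_FLUSH);                  // and corrupts the stack here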

Upvotes: 1
