user2284570

Reputation: 3070

How to compress an HTTP response with zlib?

I am writing a small HTTP server which always sends its generated HTML page (it discards the client request). This is part of a larger program.

I want to use Content-Encoding to minimize local bandwidth usage.

I started by trying this example:

#include <sys/types.h>
#include <netinet/in.h>
#include <sys/socket.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <time.h>
#include <pthread.h>
#include <zlib.h>
#define CHUNK 1048576

const char * restrict UrList[] = {"An", "example", "of", "random","ASCII string"};

void reponse(int newsockfd)
{
    char * buffer=calloc(sizeof(char),4700);
    srand(time(0));
    // hopefully this is a test, as I don't hard-code like this in real programs...
    sscanf(buffer,"<!DOCTYPE html>\
<html>\
<head>\
    <meta http-equiv=\"Content-Type\" content=\"text/html; charset=utf-8\">\
    <title color=\"white\">%s</title>\
    <body align=\"center\" valign=\"center\" >\
        <button onclick=\"launchFullscreen(document.body);\">%s!</button>\
    </body>\
</html>\0",UrList[(rand()%3)]);
    size_t inputLen = strlen(buffer);
    uLong maxOutputLen = compressBound(inputLen);
    char * destbuffer=malloc(inputLen);
    compress2(destbuffer,&maxOutputLen,buffer,inputLen,Z_BEST_COMPRESSION);
    sprintf(buffer,"200 OK\n\
Server: Serveur personalisé ayant pour but d'empêcher les accès au nom de domaines interdits de manière efficace\n\
X-Powered-By: GCC 4.8.1\n\
Cache-Control: must-revalidate, post-check=0, pre-check=0\n\
ETag: \"1390786010\"\n\
Content-Language: fr\n\
Vary: Accept-Encoding\n\
Content-Type: text/html; charset=UTF-8\n\
\n\
%s \0",destbuffer);
    write(newsockfd,buffer,inputLen);
    free((void *)buffer);
    free((void *)destbuffer);
}

void main()
{
    struct  sockaddr_in6 client_address, server_address;
    int clilen,server_handle=socket(PF_INET6, SOCK_STREAM, 0);

    if (server_handle < 0)
        perror("Unable to create socket.");
    server_address.sin6_family = AF_INET6;
    server_address.sin6_addr = in6addr_any;
    server_address.sin6_port = 0x5000; // port 80
    if ( bind(server_handle, (struct sockaddr *)&server_address, sizeof( server_address )) < 0)
        perror("Unable to bind.");
    listen(server_handle,1310730);
    clilen=sizeof(client_address);
    while(1)
    {
        // variables locales
        pthread_t parallel;

        int newsockfd=accept(server_handle, (struct sockaddr *) &client_address, &clilen);
        pthread_create(&parallel,NULL,reponse,(void *)newsockfd);
    }
}

which doesn't work: when I launch telnet 192.168.0.20 80, nothing is printed.

According to a step-by-step execution in GDB, the buffer has the correct value when write() is called.
The main point is that I can't compress a file directly: I need to do in-memory operations on the result in the real program.

Upvotes: 0

Views: 1106

Answers (1)

Adam Rosenfield

Reputation: 400700

You have a number of problems:

  1. You're calling sscanf() to read data out of an empty buffer right at the start. I think you meant to use sprintf() to write your test data into the buffer (but really, use snprintf() to avoid overflowing your buffer).
  2. You're using the same buffer for input to the compress2() function as you are for output. The documentation is not clear on whether this is supported, so to be safe I would assume that it's not.
  3. You're lying to zlib about how big your output buffer is. You're telling it that your buffer is compressBound(...) bytes, when in fact it's only 4700 bytes. Because it's impossible to compress every possible data stream, some sequences of bytes get larger after compression, so zlib may overflow your buffer if you lie to it, which can cause a segfault. If you at least tell it the truth about how big your buffer is, it will fail with an error in that case instead of reading or writing memory out of bounds.
  4. The 2nd parameter to compress2() is a pointer to a variable which is both an input and output parameter. On input, it contains the maximum size of the output buffer, and on output, it receives the actual size of the compressed data. You're passing an integer value (not a pointer) and ignoring the compiler warning about a conversion between integer and pointer types. Listen to your compiler!

Here's how to properly call the compress2() function using separate input and output buffers:

// Error checking elided for expository purposes
const char *inputBuf = "the data to compress...";
size_t inputLen = strlen(inputBuf);

// Compute the maximum size of the compressed output and allocate an output
// buffer for it -- note that this will be LARGER than the input size
uLong maxOutputLen = compressBound(inputLen);
char *outputBuf = malloc(maxOutputLen);

// Compress the data and receive the compressed data size
uLong compressedLen = maxOutputLen;
int result = compress2(outputBuf, &compressedLen, inputBuf, inputLen, Z_BEST_COMPRESSION);

// Use the compressed data
...

free(outputBuf);

Upvotes: 3
