Vlad.Z

Reputation: 170

Boost Beast: how to produce a non-chunked response with a custom body when the content length is not readily available?

I am trying to implement a custom body type that would hold a parsed JSON tree object. The task seems pretty natural, but I can't find a way to generate non-chunked HTTP messages carrying JSON using Beast. I have a REST client/server implemented by wrapping libmicrohttpd and libcurl, but I would prefer moving to Boost Beast instead.

The problem, as I understand it, is that the body type's size(value_type const&) method receives a reference to the body value to be serialized (a JSON tree object in my case), but there is no way to determine the exact length of the stringified JSON without actually stringifying it. However, if I remove the size() method, Beast assumes I am asking for chunked transfer encoding. Of course, there is nothing wrong with chunked encoding per se, but for me it would mean fixing some automation and monitoring scripts, not to mention the integration tests.
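To make the dilemma concrete, here is a minimal sketch, assuming a hypothetical json_body whose value_type is a Boost.JSON tree (any JSON library poses the same problem):

#include <boost/json.hpp>
#include <cstdint>

// Hypothetical body type holding a parsed JSON tree.
struct json_body
{
    using value_type = boost::json::value;

    // message::prepare_payload() calls this to set Content-Length.
    // Computing it exactly requires serializing the whole value up
    // front, which defeats the purpose of a streaming body:
    static std::uint64_t
    size(value_type const& jv)
    {
        return boost::json::serialize(jv).size();
    }

    // Omitting size() entirely makes prepare_payload() fall back to
    // "Transfer-Encoding: chunked" instead.
};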

What I would like is to assign a JSON object to the message I am preparing, and then have Beast ask the writer, not body::size(), for the payload size. This seems logical to me, as the actual HTTP message body (serialized JSON) is quite different from the live JSON object in memory, and it is the body::writer that produces the body bytes for transfer. Am I wrong?

Anyway, do you think there is a nice way to solve this problem?

Thank you in advance for your time and effort!

Regards, Vlad

Upvotes: 4

Views: 1323

Answers (1)

Vinnie Falco

Reputation: 5353

Thanks for the kind words, and I'm excited to see the Body customization system getting some attention! Most of what you wrote in the question is correct. If you want Beast to set the Content-Length field when you call message::prepare_payload, then you have to provide a correct implementation in Body::size. The BodyWriter is not created until the moment of serialization.

The most natural solution to this problem is to avoid providing Body::size and let Beast use the chunked Transfer-Encoding when serializing your body type.

I should also note that when you serialize the JSON, it is better to do it a little at a time in the writer instead of converting the entire thing to a string. That is the purpose of having the writer object: to hold intermediate state and allow incremental serialization. Otherwise, you have to allocate memory for the entire serialized representation, which is less efficient.
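To illustrate, here is a minimal sketch of such a writer, assuming a hypothetical json_body whose value_type is a Boost.JSON value and using boost::json::serializer to emit the output incrementally (a matching reader would still be needed for parsing incoming bodies):

#include <boost/asio/buffer.hpp>
#include <boost/beast/core/error.hpp>
#include <boost/beast/http/message.hpp>
#include <boost/json.hpp>
#include <boost/optional.hpp>
#include <cstddef>
#include <utility>

// Hypothetical body type holding a parsed JSON tree (here a Boost.JSON
// value). No size() is provided, so Beast uses the chunked
// Transfer-Encoding; the writer emits the serialized JSON one buffer
// at a time instead of building one big string.
struct json_body
{
    using value_type = boost::json::value;

    struct writer
    {
        using const_buffers_type = boost::asio::const_buffer;

        template<bool isRequest, class Fields>
        writer(
            boost::beast::http::header<isRequest, Fields> const&,
            value_type const& body)
        {
            // The serializer keeps a pointer to the JSON value, which
            // lives in the message and outlives this writer.
            sr_.reset(&body);
        }

        void
        init(boost::beast::error_code& ec)
        {
            ec = {};
        }

        // Called repeatedly during serialization; each call fills the
        // local buffer with the next slice of JSON text. The boolean
        // is true while more output remains.
        boost::optional<std::pair<const_buffers_type, bool>>
        get(boost::beast::error_code& ec)
        {
            ec = {};
            auto const chunk = sr_.read(buf_, sizeof(buf_));
            return std::make_pair(
                const_buffers_type{chunk.data(), chunk.size()},
                ! sr_.done());
        }

    private:
        boost::json::serializer sr_;
        char buf_[4096];
    };
};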

Update:

body::size() and message::prepare_payload are not intended to be used the way that you desire. If I am understanding what you want to do, then this function should handle it:

/** Prepare a message with a JSON payload.

    This function accepts ownership of a message with a JSON body
    and converts the JSON to a string, returning a new message with
    a string body. The Content-Length field is set on the new
    message. All other fields are transferred over unmodified.
*/
template<
    bool isRequest,
    class Allocator>
message<
    isRequest,
    string_body,
    basic_fields<Allocator>>
prepare(
    message<
        isRequest,
        json_body,
        basic_fields<Allocator>>&& m)
{
    message<
        isRequest,
        string_body,
        basic_fields<Allocator>> result{
            std::move(m.base()),
            json_to_string(m.body())};
    result.prepare_payload();
    return result;
}
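For completeness, a hypothetical usage sketch, assuming json_body's value_type is a Boost.JSON value, json_to_string is something along the lines of boost::json::serialize, and the stream is any synchronous write stream:

#include <boost/beast/http.hpp>
#include <boost/json.hpp>
#include <utility>

namespace http = boost::beast::http;

// Hypothetical usage: build the response around a JSON tree, convert
// it with prepare() into a string_body message (which sets
// Content-Length), then write it out non-chunked. json_body and
// prepare() are the hypothetical pieces discussed above.
template<class SyncWriteStream>
void
send_json(SyncWriteStream& stream, boost::json::value jv)
{
    http::response<json_body> res{http::status::ok, 11};
    res.set(http::field::content_type, "application/json");
    res.body() = std::move(jv);

    auto out = prepare(std::move(res)); // message<false, string_body, fields>
    http::write(stream, out);           // Content-Length set, not chunked
}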

Upvotes: 1
