Reputation: 8068
Is it possible to have HAProxy chain
a frontend request to multiple backends before responding to the client?
I would like to implement:
                 +---+    1     +---------+
                 |   +--------->|         |
                 | H |          |  Auth   |
                 | A |<-headers-+         |
+--------+       | P |    2     +---------+
| client +------>| r |
|        |<------+ o |          +---------+
+--------+   4   | x |          |         |
                 | y +-headers->|   API   |
                 |   |    3     |         |
                 +---+          +---------+
If this is possible, what would the configuration look like?
Upvotes: 0
Views: 479
Reputation: 179374
This isn't possible.
HAProxy buffers only the headers and a few additional KB of request body (if there is one) at the beginning of a request, and frees that buffer as soon as it has been sent to the back-end. Except for things it specifically extracts and retains from the request (like capture.req.uri), it has forgotten the bytes of the request (and freed the buffer) by the time a response is returned. This is why you can't use most layer 7 request fetches during response processing; you have to stash values in variables or use request header captures, which allocate chunks of memory for each captured header.
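To illustrate the stashing approach: the sketch below copies a request value into a transaction-scoped variable so it is still available during response processing, after the request buffer has been freed. Names, ports, and header choices here are illustrative, not from the question.

```
# Hedged sketch: keep request data alive until response processing.
frontend fe_main
    bind :80
    # copy the request path into a txn-scoped variable
    http-request set-var(txn.req_path) path
    # header captures also work, allocating memory per captured header
    capture request header Host len 64
    default_backend be_api

backend be_api
    server api1 127.0.0.1:8080
    # the variable survives into response processing
    http-response set-header X-Orig-Path %[var(txn.req_path)]
```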
HAProxy has introduced support for something called the Stream Processing Offload Engine (SPOE), which is currently targeted towards querying an external data source and setting internal variables (like a dynamic http-request set-var ... operation), but doesn't appear to support sending any payload (though it might be possible to use fetches to capture a small amount of the payload). It uses a binary protocol, rather than HTTP.
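For a sense of what the SPOE approach looks like, here is a rough sketch of the two pieces involved: a filter declaration in haproxy.cfg and a separate SPOE configuration file. Section names, the variable prefix, timeouts, and the be_spoe_agents backend are all illustrative; the agent itself is an external process speaking the binary SPOP protocol, not HTTP.

```
# haproxy.cfg (sketch)
frontend fe_main
    bind :80
    filter spoe engine auth config /etc/haproxy/spoe-auth.conf
    # variables set by the agent appear under the configured prefix
    http-request deny unless { var(sess.auth.allow) -m bool }
    default_backend be_api

# /etc/haproxy/spoe-auth.conf (sketch)
[auth]
spoe-agent auth-agent
    messages     check-request
    option       var-prefix auth
    timeout hello      2s
    timeout idle       30s
    timeout processing 100ms
    use-backend  be_spoe_agents

spoe-message check-request
    # sample fetches to forward; note these are header-level values,
    # not the request payload
    args src-ip=src method=method uri=path hdrs=req.hdrs
    event on-frontend-http-request
```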
The Lua integration can also sniff the initial buffer, send data to an external system, and react to the response, but it is limited in how much of the request it can actually forward, which makes it hard to use for this application.
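As a rough sketch of the Lua route: a registered action can open a TCP socket to an external service, send a small piece of the request, and stash the answer in a variable for later rules to act on. The service address, the one-line query "protocol", and the variable name are all hypothetical.

```
-- Hedged sketch for HAProxy's embedded Lua: query an external auth
-- service and record its verdict in a transaction variable.
core.register_action("check_auth", { "http-req" }, function(txn)
    local sock = core.tcp()
    sock:settimeout(1)
    -- 127.0.0.1:9000 is a hypothetical auth service
    if sock:connect("127.0.0.1", 9000) then
        -- send only a header value; forwarding the full request
        -- body this way is where the approach breaks down
        sock:send((txn.sf:hdr("Authorization") or "") .. "\n")
        local answer = sock:receive("*l")
        txn:set_var("txn.auth_result", answer or "deny")
        sock:close()
    else
        txn:set_var("txn.auth_result", "deny")
    end
end)
```

In haproxy.cfg this would be wired up with something like `http-request lua.check_auth` followed by `http-request deny unless { var(txn.auth_result) -m str allow }` (again, illustrative names).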
Upvotes: 2