Reputation: 3336
I have a Lambda function that performs several calls to DynamoDB, builds a big stringified JSON object as a response, and passes it to the client application via API Gateway. Naturally, API Gateway has the "Content Encoding enabled" option set, so all data is sent over the internet in compressed form.
The problem is that the Lambda response itself is not compressed, and it hits the 6MB response limit. Is it possible to compress the Lambda response and then decompress it on the client side in some natural way?
I've checked Node.js libraries like JSZip and ADM Zip and was surprised that, although they allow in-memory output for decompressed data, they don't accept in-memory input such as a string or buffer, only files. Lambda already has several restrictions and surprises related to working with files, so I would like to avoid the redundant workflow of writing the JSON to a temporary file just to compress it and then reading the archive back into memory.
Is there any more natural way to deal with the issue?
Upvotes: 9
Views: 10861
Reputation: 3336
I went with the approach below. For the back end:
const { deflate } = require("zlib");
const { promisify } = require("util");

const asyncDeflate = promisify(deflate);

// Deflate the JSON-serialized object and return it as a base64 string
async function zip(object) {
  return (await asyncDeflate(JSON.stringify(object))).toString("base64");
}
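For completeness, a minimal handler wired to this helper might look roughly like the sketch below; the handler and the data-fetching call are assumptions, not part of my actual code:

// Hypothetical Lambda handler using the zip() helper above
exports.handler = async (event) => {
  const result = await queryDynamoDb(event); // assumed data-fetching helper
  return {
    statusCode: 200,
    body: JSON.stringify({ data: await zip(result) }),
  };
};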
For the front end:
import * as pako from "pako";

export function unzip(base64str: string) {
  const strData = atob(base64str);
  // Convert binary string to character-number array
  const charData = strData.split("").map((x) => x.charCodeAt(0));
  // Turn number array into byte array
  const binData = new Uint8Array(charData);
  return JSON.parse(pako.inflate(binData, { to: "string" }));
}
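As a hedged usage example on the client (the endpoint URL and the { data: ... } response shape are assumptions), the call site could look like this:

// Fetch the compressed payload and restore the original JSON object
const res = await fetch("https://example.execute-api.eu-west-1.amazonaws.com/prod/report");
const { data } = await res.json();
const payload = unzip(data);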
So it is rather similar to the recent answer.
Upvotes: 0
Reputation: 193
Although I agree that using pagination or S3 is more appropriate for large amounts of data, you can compress the response returned from your Lambda function in order to avoid hitting the 6MB limit if needed.
Unfortunately, although API Gateway now supports content encodings such as gzip, my understanding is that the Lambda function's response still needs to be below the 6MB limit prior to API Gateway compressing it. I am not 100% sure about this, so someone please correct me if I am wrong.
The good news is that the Node.js standard library now has built-in support for gzip via the zlib module. I used this for compressing responses that were just barely above 6MB, and it reduced their size by roughly 78%. The downside was that I had to decompress the data manually in my Angular client application (I used the pako npm library for this).
Something like the code below worked well for me (TypeScript 4 + Node.js 12):
import { APIGatewayEvent, APIGatewayProxyResult } from 'aws-lambda';
import { gzip } from 'zlib';

export const handler = async (event: APIGatewayEvent): Promise<APIGatewayProxyResult> => {
  const response = { some: 'data' };
  const gzippedResponse = await gzipString(JSON.stringify(response));
  return {
    statusCode: 200,
    body: JSON.stringify({ data: gzippedResponse.toString('base64') }),
  };
};

const gzipString = async (input: string): Promise<Buffer> => {
  const buffer = Buffer.from(input);
  return new Promise<Buffer>((resolve, reject) => gzip(buffer, (err, data) => {
    if (err) {
      reject(err);
      return;
    }
    resolve(data);
  }));
};
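On the client side, the corresponding decompression (which I mention above but don't show) could look roughly like this with pako, assuming the { data: '<base64 gzip>' } shape returned by the handler above:

import * as pako from 'pako';

// Decode the base64 string back into bytes and gunzip it to the original JSON
export function decompressResponse(base64str: string): unknown {
  const bytes = Uint8Array.from(atob(base64str), (c) => c.charCodeAt(0));
  return JSON.parse(pako.ungzip(bytes, { to: 'string' }));
}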
Upvotes: 12
Reputation: 51634
There are multiple ways to handle this by slightly changing the architecture:
Return only a subset of the response using paging (this works best if the response contains a list of items that can be split into multiple pages).
Store part or all of the response in S3 (either prepared ahead of time if the response is static, or created on the fly if it's dynamic) and return the object's URL to the client for subsequent retrieval; a rough sketch follows below.
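For the S3 option, a minimal sketch using the AWS SDK for JavaScript v3 (the bucket name and key scheme are assumptions) could look like this:

import { S3Client, PutObjectCommand, GetObjectCommand } from '@aws-sdk/client-s3';
import { getSignedUrl } from '@aws-sdk/s3-request-presigner';

const s3 = new S3Client({});

// Store the large payload in S3 and return a short-lived URL for the client to fetch
export async function storeAndLink(payload: object): Promise<string> {
  const key = `responses/${Date.now()}.json`; // assumed key scheme
  await s3.send(new PutObjectCommand({
    Bucket: 'my-response-bucket', // assumed bucket name
    Key: key,
    Body: JSON.stringify(payload),
    ContentType: 'application/json',
  }));
  return getSignedUrl(s3, new GetObjectCommand({ Bucket: 'my-response-bucket', Key: key }), { expiresIn: 300 });
}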
Upvotes: 2