user1763510

Reputation: 1260

Compress large string into json serializable object for AWS lambda function output

I have a large string that I would like to send as the response from a Python Lambda function on AWS. The documentation states that the response must be JSON serializable (it must pass through json.dumps). My string is roughly 12 MB, which is larger than the maximum allowed payload on AWS. If I gzip the string it compresses to roughly 2 MB, but then the object is no longer JSON serializable.

Here is a minimal example:

import gzip
import json

largeString = b"sdlfkjs dlfkjs dflkj "
compressed = gzip.compress(largeString)  # bytes object
out = {}
out['data'] = compressed  # bytes are not JSON serializable
json.dumps(out)

which returns the expected error:

TypeError: Object of type bytes is not JSON serializable

Upvotes: 0

Views: 1003

Answers (1)

Mark Adler

Reputation: 112284

Use base64.b85encode(). This will expand the compressed data by 25%, so hopefully 2.5MB still fits.
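A minimal sketch of the round trip, reusing the question's example string (the receiver side here is hypothetical, just to show that the original bytes come back):

import base64
import gzip
import json

# Sender side (inside the Lambda handler): compress, then Base85-encode
# so the payload becomes a plain ASCII string that json.dumps accepts.
largeString = b"sdlfkjs dlfkjs dflkj "
compressed = gzip.compress(largeString)
encoded = base64.b85encode(compressed).decode("ascii")
body = json.dumps({"data": encoded})

# Receiver side: reverse the steps to recover the original bytes.
decoded = base64.b85decode(json.loads(body)["data"])
assert gzip.decompress(decoded) == largeString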

Note that compressing may not be a robust solution to your problem. If your data gets larger or less compressible, you may still bust the payload limit.

Upvotes: 1
