drmrbrewer

Reputation: 12989

node.js: compressing strings

I need to cache (in memory) a bunch of fairly lengthy strings. It seems a shame to use memory unnecessarily, particularly when there is a memory quota imposed, so I was wondering whether it's sensible to compress those strings before caching, and then decompress them after fetching.

It seems that node.js has a built-in zlib module, and although that seems to be aimed more at file/stream compression, it may be of use for simple string compression too, e.g. from the docs:

var zlib = require('zlib');

var input = '.................................';

// Compress the string; the callback receives the deflated data as a Buffer.
zlib.deflate(input, function(err, buffer) {
  if (!err) {
    console.log(buffer.toString('base64'));
  }
});
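Presumably the fetch side just needs the matching inflate call. Here's a minimal round-trip sketch of what I have in mind; the cache object and the cachePut/cacheGet helpers are made up purely for illustration:

var zlib = require('zlib');

// Hypothetical in-memory cache: a plain object keyed by string.
var cache = {};

function cachePut(key, str, done) {
  zlib.deflate(str, function(err, buffer) {
    if (err) return done(err);
    cache[key] = buffer;            // store the compressed Buffer
    done(null);
  });
}

function cacheGet(key, done) {
  zlib.inflate(cache[key], function(err, buffer) {
    if (err) return done(err);
    done(null, buffer.toString()); // back to the original string
  });
}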

Any thoughts? Any other libraries or utility functions available?

Upvotes: 3

Views: 8129

Answers (1)

Louis

Reputation: 132

A question about node.js compression here led me to this page, which lists a bunch of compression libraries.

I'm not sure of the exact context of your project, but if you are worried about memory and performance, it looks like this library, node-snappy, may be what you are looking for.

Basically, it takes a string, as you want, and compresses and decompresses it faster than zlib, according to the author's benchmarks:

  snappy.compress() x 479 ops/sec ±0.99% (80 runs sampled)
  zlib.gzip() x 289 ops/sec ±1.66% (86 runs sampled)
  snappy.uncompress() x 652 ops/sec ±0.86% (43 runs sampled)
  zlib.gunzip() x 559 ops/sec ±1.65% (64 runs sampled)
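For your string-caching case, usage would look roughly like the sketch below, based on the compress()/uncompress() calls the benchmark names. The exact signatures (callback vs. promise, the asBuffer option) depend on the snappy version you install, so double-check the package README:

var snappy = require('snappy');

var input = 'a fairly lengthy string to cache...';

// compress() takes a string or Buffer and calls back with the compressed Buffer.
snappy.compress(input, function(err, compressed) {
  if (err) throw err;

  // uncompress() reverses it; asBuffer: false asks for a string rather than a Buffer.
  snappy.uncompress(compressed, { asBuffer: false }, function(err, original) {
    if (err) throw err;
    console.log(original === input); // true
  });
});

Keep in mind that snappy optimises for speed rather than compression ratio, so if squeezing under a memory quota is the main goal, it's worth measuring the actual compressed sizes against zlib for your data.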

Upvotes: 1
