made_in_india

Reputation: 2277

node-cache module not caching data in AWS Lambda

I am using AWS Lambda. We tried using the node-cache module to cache a key with an expiry for data fetched from another API. Even though the data is being set, when the next request comes in node-cache is not able to return the cached data.

Here is my sample code:

// not the original code
let ClassService = require("./NodeCacheTest");

exports.handler = async (event) => {
  let classService = new ClassService();
  classService.getKey("myKey", (err, value) => {
    console.log(value);
  });
};

// NodeCacheTest
const NodeCache = require("node-cache");
const myCache = new NodeCache();

let CACHEDATA = {
  testing: true,
  time_to_live: 10
};

class NodeCacheTest {

  constructor() {
    let obj = { my: "Special", variable: 42 };
    myCache.get("myKey", function (err, value) {
      if (err || value == undefined) {
        console.log("No value available!!!");
        myCache.set("myKey", obj, 10000);
        console.log("value got cached!!!");
      }
    });
  }

  getKey(key, cb) {
    let obj = { my: "Special", variable: 42 };
    myCache.get("myKey", function (err, value) {
      if (err || value == undefined) {
        console.log("No value available!!!");
        myCache.set("myKey", obj, 10000);
        console.log("value got cached!!!");
        return cb(null, obj);
      } else {
        return cb(null, value);
      }
    });
  }
}

module.exports = NodeCacheTest;

Every time I hit the AWS Lambda endpoint using JMeter, I see "No value available!!!" printed for each call. But when I use a global variable like CACHEDATA to implement the same scenario, the values do get cached. Can anyone explain this behavior?

Upvotes: 4

Views: 3393

Answers (2)

pomSense

Reputation: 69

Here is my answer, also from the comment I wrote here:

The Lambda cache will stay intact as long as the Lambda's current execution context is valid. When AWS Lambda executes a function, it creates an 'execution context': a temporary runtime environment where external dependencies are initialized. While the function is warm, and if subsequent requests are close enough together, the execution context is reused and thus the in-memory cached data is available. Once the function goes cold, the execution context is destroyed (along with its in-memory cache). So if your executions are farther apart, in-memory caching won't work, as the previous execution context will have been destroyed.

An easier way to look at it is while the function is 'warm', the caching will work. When it goes 'cold', the cache is destroyed.
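To illustrate what the reuse means in practice, here is a minimal sketch (my own, not from the question) of the usual pattern: create the cache at module scope, outside the handler, so it lives as long as the execution context does and warm invocations get cache hits.

// sketch only: module-scope cache, reused while the container stays warm
const NodeCache = require("node-cache");

// created once per execution context, not once per invocation
const cache = new NodeCache({ stdTTL: 600 }); // ttl is in seconds

exports.handler = async (event) => {
  let value = cache.get("myKey");
  if (value === undefined) {
    console.log("cold start or expired - fetching a fresh value");
    value = { my: "Special", variable: 42 }; // stand-in for the real API call
    cache.set("myKey", value);
  } else {
    console.log("warm - served from the in-memory cache");
  }
  return value;
};

On a warm container the second branch fires; after a cold start (or once the TTL expires) you are back in the first branch, which is exactly the repeated "No value available!!!" pattern in the question.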

This will work well (and caching should always be done) if you have a Lambda that is getting a ton of requests per minute.

A caveat here is concurrency: each concurrent execution has its own execution context. Meaning if you have 5 Lambda instances running concurrently, each will have its own execution context and its own cache, which isn't shared with the other executions.

Example

If you want to give it a shot, you can try something like

import { LRUCache } from 'lru-cache'

const options = {
  max: 500,

  // for use with tracking overall storage size
  maxSize: 5000,
  sizeCalculation: (value: string, key: string) => {
    // Initial size is zero
    let size = 0

    // Sum up the lengths of each component
    size += new TextEncoder().encode(value).length
    size += new TextEncoder().encode(key).length

    return size
  },

  // how long to live in ms
  ttl: 1000 * 60 * 13,

  // return stale items before removing from cache?
  allowStale: false,

  updateAgeOnGet: false,
  updateAgeOnHas: false
}
const cache = new LRUCache(options)

const userID = '1234'
const time = Date.now()

cache.set(userID, 'value')
cache.get(userID) // "value"

// e.g. cache the time a record was last refreshed
// (stored as a string so it matches the sizeCalculation above)
cache.set(userID, String(time))

const cachedTime = cache.get(userID)
if (cachedTime && Date.now() - Number(cachedTime) > 1000 * 60 * 13) {

  // update record

  // update cache
  cache.set(userID, String(Date.now()))
}

Upvotes: 1

hephalump

Reputation: 6164

You're not able to use node-cache reliably in Lambda because of the way Lambda works.

The storage that Lambda uses is not persistent. Lambdas run in containers. On occasion you may get container reuse, and the cached data will still be available, but this is very unreliable.

If you're caching data you should look at other services like ElastiCache, or you could even potentially use DynamoDB on-demand.
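For the DynamoDB route, a rough sketch (my own illustration; the table name my-cache-table, the cacheKey key and the expiresAt TTL attribute are all assumptions) using the AWS SDK v3 document client could look like this:

// sketch only: DynamoDB table used as a shared cache with a TTL attribute
// assumes a table "my-cache-table" with partition key "cacheKey" and TTL enabled on "expiresAt"
const { DynamoDBClient } = require("@aws-sdk/client-dynamodb");
const { DynamoDBDocumentClient, GetCommand, PutCommand } = require("@aws-sdk/lib-dynamodb");

const ddb = DynamoDBDocumentClient.from(new DynamoDBClient({}));

async function getCached(cacheKey, fetchFresh, ttlSeconds = 600) {
  const { Item } = await ddb.send(new GetCommand({
    TableName: "my-cache-table",
    Key: { cacheKey }
  }));

  // DynamoDB deletes expired items lazily, so check the timestamp ourselves too
  if (Item && Item.expiresAt > Math.floor(Date.now() / 1000)) {
    return Item.value;
  }

  const value = await fetchFresh();
  await ddb.send(new PutCommand({
    TableName: "my-cache-table",
    Item: {
      cacheKey,
      value,
      expiresAt: Math.floor(Date.now() / 1000) + ttlSeconds
    }
  }));
  return value;
}

Unlike an in-memory cache, this survives cold starts and is shared across concurrent executions, at the cost of a network round trip per lookup.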

Upvotes: 2
