wuno

Reputation: 9875

Returning asynchronous data then exporting it synchronously in Node.js

Background

I am retrieving data from AWS Secrets Manager using the aws-sdk. Earlier I asked a question about how to correctly return and export the data, since the exported object never had the data resolved by the time the export was imported somewhere else. This left me with a bunch of undefined values.

After solving that problem, it was determined that the way to handle this was to wrap the aws-sdk function in a promise, then await that promise in another file. This causes me issues.

Example

If I request and return the data from AWS like this,

let secrets = {
  jwtHash: 10,
};

const client = new AWS.SecretsManager({
  region: region
});

const promise = new Promise((resolve, reject) => {
  client.getSecretValue({ SecretId: secretName }, (err, data) => {
    if (err) {
      reject(err);
    } else {
      const res = JSON.parse(data.SecretString); // JSON.parse is synchronous; no await needed
      secrets.dbUsername = res.username;
      secrets.dbPassword = res.password;
      secrets.dbHost = res.host;
      secrets.dbPort = res.port;
      secrets.dbDatabase = res.dbname;
      resolve(secrets);
    }
  });
});

module.exports = promise;

Then I can import it in another file and use the data like this,

const promise = require('../secrets');

(async () => {
  const secrets = await promise;
  // use secrets here
})();

Now let's say in that file where I am trying to use secrets I have something like this,

const pool = new Pool({
  user: secrets.dbUsername,
  host: secrets.dbHost,
  database: secrets.dbDatabase,
  password: secrets.dbPassword,
  port: secrets.dbPort
});

pool.on('error', err => {
  console.error('Unexpected error on idle client', err);
  process.exit(-1);
});

module.exports = pool;

If I wrap the pool creation in the self-invoking async function, I have trouble exporting it so it can be used anywhere in my app when I need a database connection. Similarly, I have many functions throughout my application that need access to the secret data. If I were to walk through the application wrapping all of my code in async functions, it would only cause more of these difficulties.

Question

It seems to me the best solution here would be to return the data asynchronously and once it has resolved, export it synchronously.

How can I achieve such a task in this scenario?

A win here would be,

  1. Make the request in /secrets/index.js
  2. Build the secrets object in the same file
  3. Export secrets as an object that can easily be imported anywhere else in my application without the need for asynchronous functions.

Example of How I Would Like to Use This

const secrets = require('../secrets');

const pool = new Pool({
  user: secrets.dbUsername,
  host: secrets.dbHost,
  database: secrets.dbDatabase,
  password: secrets.dbPassword,
  port: secrets.dbPort
});

pool.on('error', err => {
  console.error('Unexpected error on idle client', err);
  process.exit(-1);
});

module.exports = pool;

Upvotes: 4

Views: 2511

Answers (3)

JonShipman

Reputation: 695

One thing I do (especially when working with a large application whose static variables have been moved to a database) is load those values via a function that populates an export.

// config.js
const exports = {};

export async function populate() {
    const RUNTIMEVARS = await what_you_do_to_get_vars();
    
    for (const config of RUNTIMEVARS) {
        exports[config.key] = config.data;
    }
    
    // for anything needing the config in the bootstrap.
    return exports;
}

export default exports;

Then in the bootstrap:

// bootstrap.js

import './database-connection.js'; // important to have no internal dependencies.

(async() => {
    const { populate } = await import('./config.js');
    await populate();
    
    import('./application/index.js');
})()

Now any file inside your application can import config from '../config.js' as though it were statically declared as we populated the object in the populate function in the bootstrap.
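To make the mechanics concrete, here is a minimal, self-contained sketch of the same pattern in plain Node.js (fetchRuntimeVars is a hypothetical stub standing in for what_you_do_to_get_vars, and the keys are made up). The shared object is created once; populate() mutates it in place, so every module holding a reference sees the filled-in values after the bootstrap awaits populate():

```javascript
// Shared config object: created empty, filled in later by populate().
const config = {};

// Hypothetical stub standing in for the real async source of variables.
async function fetchRuntimeVars() {
  return [
    { key: 'dbHost', data: 'localhost' },
    { key: 'dbPort', data: 5432 }
  ];
}

// Copies each fetched key/value pair onto the shared object in place.
async function populate() {
  const vars = await fetchRuntimeVars();
  for (const entry of vars) {
    config[entry.key] = entry.data;
  }
  return config; // handy for anything needing the config in the bootstrap
}

// Bootstrap: populate before any consumer reads config.
populate().then(() => {
  console.log(config.dbHost, config.dbPort); // localhost 5432
});
```

The key point is that consumers import the object itself; because populate() mutates that same object rather than reassigning it, the reference every module holds stays valid.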

Upvotes: 0

TJBlackman

Reputation: 2313

I would suggest doing everything in one file, and then instead of exporting the object you create, export a function that returns the object. The function will always have access to the most up-to-date version of the object, and you can call it from any file to access the same object.

Example: Create two files in a folder. In the first file, we will do this:

  • Define a value.
  • Set a timeout to change the value after some time
  • Export the value itself
  • Export a function that returns the value

values.js

let x = 0; // set initial value
setTimeout(() => { x = 5; }, 2000); // sometime later, value will change

const getValueOfX = () => { return x; }; 

module.exports = {
    x: x,
    getValueOfX: getValueOfX
}; 

Now in the other file, we just import the two exports from the previous file (we put them both in an object for easy exporting). We can then log them out, wait for some time to pass, and log them out again.

index.js

let values = require('./values');

console.log(`Single value test. x = ${values.x}`);
console.log(`Function return value test. x = ${values.getValueOfX()}`);
setTimeout(() => { console.log(`Single value test. x = ${values.x}`); }, 4000);
setTimeout(() => { console.log(`Function return value test. x = ${values.getValueOfX()}`); }, 4000);

To run the code, just open your Terminal or Command Prompt and, from the same directory as these two files, run node index.js

You'll see that when just the value (object, array, w/e) is exported, it is exported as-is when the export runs - almost always before the API call is finished.

BUT - If you export a function that returns the value (object, array, w/e), then that function will retrieve the up-to-date version of the value at the time it is called! Great for API calls!

so your code might look like this:

let secrets = { jwtHash: 10 };
const client = new AWS.SecretsManager({
    region: region
});

let pool = null; 

client.getSecretValue({ SecretId: secretName }, (err, data) => {
    if (err) {
        console.error(err); // no promise wrapper here, so log (or rethrow) instead of reject
    } else {
        const res = JSON.parse(data.SecretString); // synchronous; no await needed
        pool = new Pool({
            user: res.username,
            host: res.host,
            database: res.dbname,
            password: res.password,
            port: res.port
        });
        pool.on('error', err => {
            console.error('Unexpected error on idle client', err);
            process.exit(-1);
        });
    }
});

module.exports = function(){ return pool; };
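One caveat with this getter pattern, shown in a minimal, self-contained sketch (the setTimeout and the plain object are stand-ins for the SecretsManager callback and pg's Pool): any consumer that calls the getter before the callback has fired gets null, so early callers must handle that case.

```javascript
// Module-level variable: starts null, assigned later by async work.
let pool = null;

// Stand-in for the async SecretsManager callback above.
setTimeout(() => {
  pool = { query: () => 'connected' }; // hypothetical pool object
}, 100);

// The exported getter always returns the *current* value.
const getPool = () => pool;

console.log(getPool()); // null -- the callback has not fired yet
setTimeout(() => {
  console.log(getPool().query()); // 'connected' -- callback has fired by now
}, 200);
```

In a real app you would either delay the first getter call until after initialization, or have the getter throw a clear error when the pool is still null.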

Upvotes: 2

CertainPerformance

Reputation: 370929

Because the needed data is retrieved asynchronously, there's no way around making everything that depends on it (somehow) asynchronous as well. With asynchronicity involved, it's usually best to export functions that can be called on demand, rather than exporting objects:

  • an object that depends on the asynchronous data can't be meaningfully exported before the data comes back
  • if you export functions rather than objects, you can ensure that control flow starts from your single entry point and heads downstream, rather than every module initializing itself at once (which can be problematic when some modules depend on others to be initialized properly, as you're seeing)
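A tiny sketch of the difference (the names here are illustrative, not from the question): exporting a value snapshots it at require time, while exporting a function defers the read until call time.

```javascript
// Simulates a module-level binding that async work fills in later.
let secret = undefined;

const exportedValue = secret;          // snapshot: captured while still undefined
const exportedGetter = () => secret;   // deferred: reads the current value when called

secret = 'resolved-later';             // simulates the async data arriving

console.log(exportedValue);    // undefined -- the snapshot never updates
console.log(exportedGetter()); // 'resolved-later' -- the getter sees the new value
```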

On another note, note that if you have a single Promise that needs to resolve, it's probably easier to call .then on it than use an async function. For example, rather than

const promise = require('../secrets');

(async () => {
  // try/catch is needed to handle rejected promises when using await:
  try {
    const secrets = await promise;
    // use secrets here
  } catch(e) {
    // handle errors
  }
})();

you might consider:

const promise = require('../secrets');

promise
  .then((secrets) => {
    // use secrets here
  })
  .catch((err) => {
    // handle errors
  });

It's less wordy and probably easier to make sense of at a glance than an async IIFE. IMO, the place to use await is when you have multiple Promises that need to resolve, and chaining .thens and returned Promises together would get too ugly.

A module that depends on secrets has to, somewhere in its code, effectively wait for secrets to be populated. Although being able to use your const secrets = require('../secrets'); in your lower code example would be nice, it just isn't possible like that. You can export a function that takes secrets as a parameter rather than as a require, and then (synchronously!) return the instantiated pool:

// note, secrets is *not* imported
function makePool(secrets) {
  const pool = new Pool({
    user: secrets.dbUsername,
    host: secrets.dbHost,
    database: secrets.dbDatabase,
    password: secrets.dbPassword,
    port: secrets.dbPort
  });

  pool.on('error', err => {
    console.error('Unexpected error on idle client', err);
    process.exit(-1);
  });
  return pool;
}

module.exports = makePool;

Then, to use it in another module, once the secrets are created, call makePool with the secrets, and then use / pass around the returned pool:

const secretsProm = require('../secrets');
const makePool = require('./makePool');
secretsProm.then((secrets) => {
  const pool = makePool(secrets);
  doSomethingWithPool(pool);
})
.catch((err) => {
  // handle errors
});

Note that the doSomethingWithPool function can be completely synchronous, as is makePool - the asynchronous nature of secrets, once handled with .then in one module, does not have to be dealt with asynchronously anywhere else, as long as other modules export functions, rather than objects.

Upvotes: 4
