Ayush Kumar

Reputation: 954

Nodejs Split large array and make multiple API calls

I have a CSV file that contains 21k records (one alphanumeric word per line). I need to read these records and send them to an API, as JSON key-value pairs, for some processing. The API accepts only 500 elements at a time. I have a solution in mind, but I wanted to know whether there is a better or more efficient solution/algorithm for this.

Algorithm:

  1. Load the CSV into an array
  2. Split this 1D array into N arrays, each with a fixed length of 500 elements
  3. For each of these N arrays, prepare a JSON payload and send it to the API.

Code:

const fs = require('fs');

fs.readFile(inputPath, 'utf8', function (err, data) {
  if (err) throw err;

  // chunking has to happen inside the callback, after the file has been read
  const dataArray = data.split(/\r?\n/);

  for (let i = 0; i < dataArray.length; ) {
    const temp = [];
    for (let j = 0; j < 500 && i < dataArray.length; j++) {
      temp.push(dataArray[i]);
      i++;
    }
    // make API call with the current chunk of up to 500 records
    makeCallToAPI(temp);
  }
});

Upvotes: 1

Views: 1412

Answers (1)

danh

Reputation: 62676

I'd use lodash's (or underscore's) _.chunk(). Also note that both the fs read and the API calls are better handled asynchronously.
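
For reference, _.chunk(array, size) splits an array into groups of at most size elements, which is exactly the splitting step from the question:

const _ = require('lodash');

_.chunk(['a', 'b', 'c', 'd', 'e'], 2);
// => [['a', 'b'], ['c', 'd'], ['e']]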

const fs = require('fs');
const _ = require('lodash');

async function callApi(chunk) {
  // return a promise that resolves with the result of the api
}

// wrap fs.readFile in a promise and resolve with the array of lines
async function readFS(inputPath) {
  return new Promise((resolve, reject) => {
    fs.readFile(inputPath, 'utf8', function (err, data) {
      if (err) reject(err);
      else resolve(data.split(/\r?\n/));
    });
  });
}

async function doTheWork(inputPath) {
  const data = await readFS(inputPath);
  const chunks = _.chunk(data, 500);      // arrays of at most 500 records each
  const promises = chunks.map(callApi);   // one API call per chunk
  return _.flatten(await Promise.all(promises));
}

Also note the use of _.flatten() on the awaited result, since Promise.all() will resolve to an array of per-chunk results, i.e. an array of arrays.
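
To fill in the callApi stub, a minimal sketch might look like the following. The endpoint URL and the payload key (records) are placeholders I'm assuming for illustration, and axios is just one HTTP client that returns a promise; any equivalent would work:

const axios = require('axios');

async function callApi(chunk) {
  // hypothetical payload shape: the API in the question expects JSON key-value pairs
  const payload = { records: chunk };
  // hypothetical endpoint, substitute the real API URL
  const response = await axios.post('https://example.com/api/process', payload);
  return response.data;   // resolves with the result of the API
}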

Upvotes: 1
