Reputation: 26340
I have a logic app that (among other, not relevant things) calls API A to fetch data (as a JSON array), and sends that data to API B.
B handles data uploads in batches that are smaller than the size of the data set that A returns, so in order to submit them, I must chunk up the data from A into smaller arrays and submit multiple batches to B.
I have not found any Logic App actions that look like they would do the job, but it is quite possible that I missed something. There is the foreach action, which is probably what I want to use after the data is chunked up, but I definitely do not want to submit these records one by one.
Preliminarily, I think I can get the job done with a custom JS action: grab the results from the call to A, chunk up the array, and return an array of arrays, like so:
// Output of the action that called API A (a JSON array)
const data = workflowContext.actions.API_A_ACTION.outputs.body;
const batchSize = 1000;
const batchedData = [];
// Slice the array into consecutive chunks of up to batchSize items
for (let i = 0; i < data.length; i += batchSize) {
  batchedData.push(data.slice(i, i + batchSize));
}
return batchedData;
Then the foreach action could grab the results from the JavaScript action and submit the data in batches.
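The chunking logic itself can be checked outside the Logic App with plain sample data (workflowContext only exists inside the inline-code action, so a placeholder array stands in for API A's response here):

```javascript
// Stand-in for the API A response; 2500 dummy records
const data = Array.from({ length: 2500 }, (_, i) => ({ id: i }));

const batchSize = 1000;
const batchedData = [];

// Slice the array into consecutive chunks of up to batchSize items
for (let i = 0; i < data.length; i += batchSize) {
  batchedData.push(data.slice(i, i + batchSize));
}

// 2500 records at a batch size of 1000 yields batches of 1000, 1000, 500
console.log(batchedData.map(b => b.length));
```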
If there is a better way to do this, though, I'd like to know.
Upvotes: 1
Views: 946
Reputation: 227
There is also a chunk function in Logic Apps that splits a string or collection into smaller parts. You can use it to split up arrays. If you combine it with a foreach loop, you can process a big array bit by bit. If you want to merge the output from each iteration, use the "Append to array variable" action.
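For example, a foreach could iterate directly over the chunked output of the API A call, so no inline-code action is needed. A rough sketch of the workflow definition (the action names and the API B URL are placeholders):

```json
{
  "For_each_batch": {
    "type": "Foreach",
    "foreach": "@chunk(body('API_A_ACTION'), 1000)",
    "actions": {
      "Send_batch_to_B": {
        "type": "Http",
        "inputs": {
          "method": "POST",
          "uri": "https://api-b.example.com/upload",
          "body": "@item()"
        }
      }
    }
  }
}
```

Each @item() inside the loop is then an array of up to 1000 records rather than a single record.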
Upvotes: 0
Reputation: 3111
Doing it with inline code results in the smallest number of actions, and is therefore faster and cheaper. An alternative is to configure workflow B with the Batch trigger, setting the batch release criteria to the size of the smaller array you need, and then have workflow A send the data to B with the Send to batch action.
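A minimal sketch of what the Batch trigger in workflow B might look like, assuming an inline batch configuration released on message count (the batch name "DataBatch" and the count are placeholders to adjust to API B's limit):

```json
{
  "triggers": {
    "Batch_messages": {
      "type": "Batch",
      "inputs": {
        "mode": "Inline",
        "configurations": {
          "DataBatch": {
            "releaseCriteria": {
              "messageCount": 1000
            }
          }
        }
      }
    }
  }
}
```

Workflow A would then loop over the records from API A and call the Send to batch action for each one; B fires once 1000 messages have accumulated and processes them as a group.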
Upvotes: 1