Reputation: 4833
I'm working with a really big array in JS, and I can see that most of the time is spent loading and parsing the JSON data.
(This is a Chrome extension, but maybe I can move it to a Node.js app.)
This is basically how I load my data:
async function loadData(jsonFiles){
  const fullData = [];
  for (const jsonFile of jsonFiles) {
    const localUrl = 'http://localhost/' + jsonFile;
    const response = await fetch(localUrl);
    if (response.ok) {
      try {
        const data = await response.json();
        const L = data.length;
        for (let k = 0; k < L; k++) {
          fullData.push(data[k]);
        }
      } catch (e) {
        // Ignore files that fail to parse
      }
    }
  }
  return fullData;
}
Is there any faster way to do that, even if it means saving the data in another way/format?
Upvotes: 0
Views: 368
Reputation: 2753
Make the fetch calls and the JSON parsing run in parallel:
async function loadData(jsonFiles){
  const calls = [];
  for (const jsonFile of jsonFiles) {
    calls.push(fetch(jsonFile).then(response => response.json()));
  }
  return Promise.allSettled(calls)
    .then(parts => parts.filter(({status}) => status === "fulfilled")) // keep only successful fetches
    .then(parts => parts.map(({value}) => value))                      // unwrap the parsed JSON
    .then(parts => parts.flat());                                      // merge the arrays into one
}
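A note on the design choice: unlike Promise.all, Promise.allSettled resolves even when some of the calls reject (a failed fetch or invalid JSON), so one bad file doesn't discard the rest; the status === "fulfilled" filter then keeps only the parts that loaded, which mirrors the silent try/catch in the question.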
Upvotes: 1
Reputation: 1074415
You can do the fetch calls in parallel, but other than that there's not a lot more you can do:
function loadData(jsonFiles){
  return Promise.all(
    jsonFiles.map(async file => {
      const localUrl = 'http://localhost/' + file;
      const response = await fetch(localUrl);
      if (response.ok) {
        try {
          return await response.json();
        } catch (e) {
          return null; // Invalid JSON; skip this file
        }
      } else {
        return null; // HTTP error; skip this file
      }
    })
  ).then(results => {
    return results.filter(result => result); // Filter out the `null`s
  }).then(results => {
    return results.flat(); // Flatten the results into one array
  });
}
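For completeness, here's a minimal usage sketch; the file names are hypothetical placeholders for files that each contain a JSON array of records:

// Hypothetical file list; each file holds one JSON array.
const files = ['part1.json', 'part2.json', 'part3.json'];

loadData(files).then(fullData => {
  console.log(`Loaded ${fullData.length} records`);
});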
Upvotes: 3