Reputation: 1655
I have a function that takes in 2 parameters: an array csvFields:
["id","name","age",...]
and another array of arrays csvRows which contains the data for those fields:
[["1", "john", "10"],["2", "Jane", "11"],["3", "John Doe", "12"],...]
What this function outputs is an array of objects like this:
[{id: "1", name: "john", age: "10"},{id: 2", name: "jane", age: "11"},{id: "3", name: "John doe", age: "12"}, ...]
and here's the function:
const arrayOfArraysToArrayOfObjects = (csvFields, csvRows) => {
  const data = csvRows.map((row) => {
    let obj = {};
    csvFields.forEach((field, index) => {
      obj[field] = row[index];
    });
    return obj;
  });
  return data;
};
const csvFields = ["id","name","age"];
const csvRows = [["1", "john", "10"],["2", "Jane", "11"],["3", "John Doe", "12"]];
const data = arrayOfArraysToArrayOfObjects(csvFields, csvRows);
console.log(data);
The problem is that this function is not really efficient and gets really slow with big CSV files that have a lot of rows. Also, the fields can vary between CSV files, which is why I have to keep it all dynamic.
Is there any way to make this function more efficient?
Thanks for the help!
Upvotes: 0
Views: 65
Reputation: 8078
Plain loops are faster than array methods; see this benchmark: https://jsperf.com/my-test321321321
const arrayOfArraysToArrayOfObjects = (csvFields, csvRows) => {
  let res = []
  for (let arr of csvRows) {
    let obj = {}
    for (let i = 0; i < arr.length; i++) {
      obj[csvFields[i]] = arr[i]
    }
    res.push(obj)
  }
  return res;
};
const csvFields = ["id","name","age"];
const csvRows = [["1", "john", "10"],["2", "Jane", "11"],["3", "John Doe", "12"]];
const data = arrayOfArraysToArrayOfObjects(csvFields, csvRows);
console.log(data);
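If the jsperf link above stops working, you can get a rough comparison on your own data with console.time in Node or a browser console. This is an illustrative sketch (the function names and the synthetic data generator are made up here, not part of the answer above); absolute timings will vary by engine and data size:

```javascript
// Build a synthetic dataset large enough for timing differences to show.
const makeRows = (n) =>
  Array.from({ length: n }, (_, i) => [String(i), "name" + i, String(i % 90)]);

const fields = ["id", "name", "age"];
const rows = makeRows(100000);

// Plain-loop version (as in the answer above).
const withLoops = (csvFields, csvRows) => {
  const res = [];
  for (const arr of csvRows) {
    const obj = {};
    for (let i = 0; i < arr.length; i++) obj[csvFields[i]] = arr[i];
    res.push(obj);
  }
  return res;
};

// Original .map()/.forEach() version from the question.
const withMap = (csvFields, csvRows) =>
  csvRows.map((row) => {
    const obj = {};
    csvFields.forEach((field, index) => { obj[field] = row[index]; });
    return obj;
  });

console.time("loops");
const a = withLoops(fields, rows);
console.timeEnd("loops");

console.time("map");
const b = withMap(fields, rows);
console.timeEnd("map");
```

Both versions produce identical output, so this only measures the cost of the iteration style itself.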
Upvotes: 2
Reputation: 386560
You could use Object.assign and spread single key/value objects.
const
  arrayOfArraysToArrayOfObjects = (csvFields, csvRows) =>
    csvRows.map(row => Object.assign(...csvFields.map((k, i) => ({ [k]: row[i] })))),
  csvFields = ["id", "name", "age"],
  csvRows = [["1", "john", "10"], ["2", "Jane", "11"], ["3", "John Doe", "12"]],
  data = arrayOfArraysToArrayOfObjects(csvFields, csvRows);
console.log(data);
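On engines that support ES2019, Object.fromEntries avoids the intermediate single-key objects that Object.assign spreads here. This is an alternative sketch (the name toObjects is illustrative, not from the answer above):

```javascript
// Pair each field name with the row value at the same index, then build
// the object from those [key, value] entries in one pass.
const toObjects = (csvFields, csvRows) =>
  csvRows.map((row) => Object.fromEntries(csvFields.map((k, i) => [k, row[i]])));

console.log(toObjects(["id", "name", "age"], [["1", "john", "10"]]));
// → [{ id: "1", name: "john", age: "10" }]
```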
Upvotes: 1
Reputation: 370679
One possible improvement would be to use for
loops instead of array methods when iterating over the large array (csvRows
) and the smaller array (csvFields
): for loops have slightly less overhead (but are less readable), and so might be more performant with huge files:
const arrayOfArraysToArrayOfObjects = (csvFields, csvRows) => {
  const rowLen = csvRows.length;
  const fieldLen = csvFields.length;
  const results = [];
  for (let i = 0; i < rowLen; i++) {
    const row = csvRows[i];
    const obj = {};
    for (let j = 0; j < fieldLen; j++) {
      obj[csvFields[j]] = row[j];
    }
    results.push(obj);
  }
  return results;
};
const csvFields = ["id", "name", "age"];
const csvRows = [
  ["1", "john", "10"],
  ["2", "Jane", "11"],
  ["3", "John Doe", "12"]
];
const data = arrayOfArraysToArrayOfObjects(csvFields, csvRows);
console.log(data);
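If profiling still shows the result array as a hot spot after the change above, one further micro-optimization to try (a sketch only; the benefit varies by engine and may be negligible) is preallocating the results array and assigning by index instead of pushing:

```javascript
const arrayOfArraysToArrayOfObjects = (csvFields, csvRows) => {
  const rowLen = csvRows.length;
  const fieldLen = csvFields.length;
  const results = new Array(rowLen); // preallocate instead of growing via push
  for (let i = 0; i < rowLen; i++) {
    const row = csvRows[i];
    const obj = {};
    for (let j = 0; j < fieldLen; j++) {
      obj[csvFields[j]] = row[j];
    }
    results[i] = obj; // write directly into the preallocated slot
  }
  return results;
};

console.log(arrayOfArraysToArrayOfObjects(["id", "age"], [["1", "10"], ["2", "11"]]));
```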
Upvotes: 1