I have a function that takes two parameters: an array csvFields:
["id","name","age",...]
and another array of arrays csvRows which contains the data for those fields:
[["1", "john", "10"],["2", "Jane", "11"],["3", "John Doe", "12"],...]
What this function outputs is an array of objects like this:
[{id: "1", name: "john", age: "10"},{id: "2", name: "Jane", age: "11"},{id: "3", name: "John Doe", age: "12"}, ...]
and here's the function:
const arrayOfArraysToArrayOfObjects = (csvFields, csvRows) => {
  const data = csvRows.map((row) => {
    let obj = {};
    csvFields.forEach((field, index) => {
      obj[field] = row[index];
    });
    return obj;
  });
  return data;
};
const csvFields = ["id","name","age"];
const csvRows = [["1", "john", "10"],["2", "Jane", "11"],["3", "John Doe", "12"]];
const data = arrayOfArraysToArrayOfObjects(csvFields, csvRows);
console.log(data);
The problem is that this function gets really slow on big CSV files with a lot of rows. Also, the fields can vary between CSV files, which is why I have to keep everything dynamic.
Is there any way to make this function more efficient?
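One variant I've been considering, though I haven't benchmarked it on large files yet, replaces map/forEach with plain indexed loops and a preallocated result array (same csvFields/csvRows shapes as above):

const arrayOfArraysToArrayOfObjectsLoop = (csvFields, csvRows) => {
  const fieldCount = csvFields.length;
  // Preallocate the result array so it doesn't have to grow repeatedly
  const data = new Array(csvRows.length);
  for (let i = 0; i < csvRows.length; i++) {
    const row = csvRows[i];
    const obj = {};
    for (let j = 0; j < fieldCount; j++) {
      obj[csvFields[j]] = row[j];
    }
    data[i] = obj;
  }
  return data;
};

Would something like this actually help, or is the map/forEach version already fine and the slowdown likely elsewhere (e.g. in parsing the CSV itself)?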
Thanks for the help!