Reputation: 1223
So what I have is an array of objects called "siteRows", held in my reducer's state. I have a delete function that "deletes" (sets to null) properties of the objects in those rows.
E.g. (the '-' sign just means that column has no value, so null):
"1 2 3 4 -"
"test me stackoverflow - -"
"1 2 3 78 -"
In rows 1 & 3 the "1 2 3" part is the same. Imagine now you remove the "4" from the first row. Then "1 2 3 - -" is a "unique" row in this collection. If I then also remove the "78" from the third row, you'll get the following:
"1 2 3 - -"
"test me stackoverflow - -"
"1 2 3 - -"
As you can see, rows 0 and 2 (indices) are the same, not unique, so I'd want to keep only one of them (for example, the first one you come across) and remove the rest of the duplicates. So after removing the "78" I would want the following:
"1 2 3 - -"
"test me stackoverflow - -"
and that's the entire array.
Now, the code that I have is as follows:
return {
  ...state,
  siteRows: state.siteRows
    .map(recurCheck(action.payload?.id))
    .filter((row) => keys.some(([key]) => row[key]))
    //.splice(state.siteRows.findIndex((row) => row.id))
};
"recurCheck" basically loops over all rows and nulls the correct column of the correct row. So it's like "I need to delete column 2, but of row 2": it checks row 1 ... nothing there; row 2 ... ah yes, column 2! (sets its value to null), and keeps going.
Basically, on THAT result (so after the "map" and "filter"), I'd want to remove the duplicates too. I can't do a simple "distinct" check, because these are objects, and objects with the same values still have different references, so it would never work.
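The reference problem can be seen in a quick sketch (illustrative objects, not the actual siteRows shape):

```javascript
// Two objects with identical values are still different references,
// so Set- or indexOf-based "distinct" checks treat them as unique.
const rowA = { a: 1, b: 2, c: 3, d: null, e: null };
const rowB = { a: 1, b: 2, c: 3, d: null, e: null };

console.log(rowA === rowB);              // false
console.log(new Set([rowA, rowB]).size); // 2, not 1
```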
Does anyone know how I can easily filter out "duplicate" objects from this array as well, if there are any? So keep the first "1 2 3 - -" you come across but remove the rest of those rows (because they aren't unique)?
As you can see I've also tried with "splice", and with adding other conditions to the "filter", but to no avail.
Upvotes: 1
Views: 1196
Reputation: 3399
If you had a function
const rowsAreEqual = (a, b) => /* returns true if a == b */
then the idea is simple:
function removeDuplicates(array) {
  return array.filter((v, i) =>
    array.slice(0, i)
      .every((other) => !rowsAreEqual(v, other))
  );
}
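A minimal sketch of how this could look end to end, assuming rows are flat objects of primitive values (the column names here are made up for illustration):

```javascript
// Hypothetical rowsAreEqual: compares two flat row objects by value,
// checking that they have the same keys and identical values per key.
const rowsAreEqual = (a, b) => {
  const keysA = Object.keys(a);
  const keysB = Object.keys(b);
  return keysA.length === keysB.length &&
    keysA.every((k) => a[k] === b[k]);
};

// Keeps the first occurrence of each row; later value-equal rows are dropped.
function removeDuplicates(array) {
  return array.filter((v, i) =>
    array.slice(0, i).every((other) => !rowsAreEqual(v, other))
  );
}

const rows = [
  { a: 1, b: 2, c: 3, d: null },
  { a: 'test', b: 'me', c: null, d: null },
  { a: 1, b: 2, c: 3, d: null }, // duplicate of the first row by value
];
console.log(removeDuplicates(rows).length); // 2
```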
However, this might have efficiency issues for large arrays, as it's O(n^2). If you had a way of serializing every row into a primitive (i.e. number or string), then you can make it O(n):
const serializeRow = (row) => /* ... return unique number or string for that row */
function removeDuplicates(array) {
  const previousRows = new Set();
  return array.filter((v) => {
    const id = serializeRow(v);
    if (previousRows.has(id)) {
      return false;
    }
    previousRows.add(id);
    return true;
  });
}
(Feel free to remove array.filter and use a plain old loop if you want; this just describes the idea.)
If every row is just an array of number | null cells, then something like
const serializeRow = (row) => row.join(';')
should be enough.
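Since the rows in the question are objects rather than arrays, one option (an assumption on my part, relying on stable key order and primitive values) is to serialize the object's values instead:

```javascript
// Assumes each row is a flat object of primitive values (number | string | null)
// with consistent key order; JSON.stringify gives a stable string key for the Set.
const serializeRow = (row) => JSON.stringify(Object.values(row));

// O(n) dedup: keep a row only if its serialized form hasn't been seen before.
function removeDuplicates(array) {
  const previousRows = new Set();
  return array.filter((v) => {
    const id = serializeRow(v);
    if (previousRows.has(id)) {
      return false;
    }
    previousRows.add(id);
    return true;
  });
}

const rows = [
  { a: 1, b: 2, c: 3, d: null },
  { a: 'test', b: 'me', c: null, d: null },
  { a: 1, b: 2, c: 3, d: null }, // duplicate of the first row by value
];
console.log(removeDuplicates(rows).length); // 2
```

Note that JSON.stringify distinguishes 1 from "1", which is usually what you want; it does not, however, handle rows whose keys appear in different orders.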
Upvotes: 2