Reputation: 10912
I have an object that contains an array of objects.
const obj = {};
obj.arr = [];
obj.arr.push({place: "here", name: "stuff"});
obj.arr.push({place: "there", name: "morestuff"});
obj.arr.push({place: "there", name: "morestuff"});
I'm wondering what is the best method to remove duplicate objects from an array. So for example, obj.arr
would become...
{place:"here",name:"stuff"},
{place:"there",name:"morestuff"}
Upvotes: 953
Views: 1306117
Reputation: 8087
This Map-based implementation is stable (it keeps the original input order), performant (the number of steps required grows as O(n) with the input size n), and flexible (both the comparison key and whether the first or last element with the same key is kept can be specified):
const removeDuplicates = <T,>(
  elements: Iterable<T>,
  getKey: (element: T) => unknown = element => element,
  keepLast: boolean = false
): Iterable<T> => {
  const uniqueEntries = new Map();
  for (const element of elements) {
    const key = getKey(element);
    if (!uniqueEntries.has(key) || keepLast) {
      uniqueEntries.set(key, element);
    }
  }
  return uniqueEntries.values();
};
Example:
> [...removeDuplicates([1, 1, 2])]
[1, 2]
> const array = [{id: 1, x: 'foo'}, {id: 1, x: 'bar'}, {id: 2, x: 'baz'}]
> [...removeDuplicates(array, element => element.id)]
[{ id: 1, x: 'foo' }, { id: 2, x: 'baz' }]
> [...removeDuplicates(array, element => element.id, true)]
[{ id: 1, x: 'bar' }, { id: 2, x: 'baz' }]
Upvotes: 1
Reputation: 22354
filter() (Preserves order)
If you have some identifier in the objects which signifies uniqueness (e.g. id), then we can use filter() with findIndex() to work through the list and verify that the index of each object with that id value matches only itself. This means that there's only one such object in the list, i.e. no duplicates.
myArr.filter((obj1, i, arr) =>
  arr.findIndex(obj2 => (obj2.id === obj1.id)) === i
)
(Note that this solution keeps the first instance of detected duplicates in the result. You can instead take the last instance by replacing findIndex with findLastIndex in the above.)
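A minimal sketch of the findLastIndex variant (note: Array.prototype.findLastIndex requires ES2023, e.g. Node 18+; the sample data here is made up for illustration):

```javascript
const myArr = [
  { id: 1, x: 'first' },
  { id: 2, x: 'only' },
  { id: 1, x: 'last' }
];

// Keep the *last* instance of each duplicate id instead of the first
const lastUniq = myArr.filter((obj1, i, arr) =>
  arr.findLastIndex(obj2 => obj2.id === obj1.id) === i
);

console.log(lastUniq); // [{ id: 2, x: 'only' }, { id: 1, x: 'last' }]
```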
If the order is not important, then map solutions will be faster: Solution with map
The above format can be applied to other cases by altering how we check for duplicates (i.e. replacing obj2.id === obj1.id with something else).

Unique by multiple properties (place and name, as in the question)

myArr.filter((obj1, i, arr) =>
  arr.findIndex(obj2 =>
    ['place', 'name'].every(key => obj2[key] === obj1[key])
  ) === i
)
Unique by all properties

myArr.filter((obj1, i, arr) =>
  arr.findIndex(obj2 =>
    JSON.stringify(obj2) === JSON.stringify(obj1)
  ) === i
)
Caveats:
JSON.stringify() key order is generally consistent, but is only guaranteed in ES2015 and later
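A quick illustration of the key-order caveat: two logically equal objects created with different key insertion order stringify differently, so the JSON.stringify comparison would not treat them as duplicates:

```javascript
const a = { place: 'here', name: 'stuff' };
const b = { name: 'stuff', place: 'here' }; // same data, different key order

// Stringification follows insertion order, so these are not equal strings
console.log(JSON.stringify(a)); // {"place":"here","name":"stuff"}
console.log(JSON.stringify(b)); // {"name":"stuff","place":"here"}
console.log(JSON.stringify(a) === JSON.stringify(b)); // false
```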
Upvotes: 538
Reputation: 21491
Simple and performant solution with a better runtime than the 70+ answers that already exist:
const ids = arr.map(({ id }) => id);
const filtered = arr.filter(({ id }, index) => !ids.includes(id, index + 1));
const arr = [{
id: 1,
name: 'one'
}, {
id: 2,
name: 'two'
}, {
id: 1,
name: 'one'
}];
const ids = arr.map(({ id }) => id);
const filtered = arr.filter(({ id }, index) => !ids.includes(id, index + 1));
console.log(filtered);
Array.filter() removes all duplicate objects by checking whether the previously mapped id array includes the current id ({ id } destructures the object into only its id). To filter out only actual duplicates, it uses Array.includes()'s second parameter, fromIndex, with index + 1, which ignores the current object and all previous ones.
Since every iteration of the filter callback only searches the array beginning at the current index + 1, this also dramatically reduces the runtime, because only objects not previously filtered get checked.
No id? Just create a temporary one:
const objToId = ({ name, city, birthyear }) => `${name}-${city}-${birthyear}`;
const ids = arr.map(objToId);
const filtered = arr.filter((item, index) => !ids.includes(objToId(item), index + 1));
Upvotes: 264
Reputation: 10828
How about with some ES6 magic?
obj.arr = obj.arr.filter((value, index, self) =>
  index === self.findIndex((t) => (
    t.place === value.place && t.name === value.name
  ))
)
A more generic solution would be:
const uniqueArray = obj.arr.filter((value, index) => {
  const _value = JSON.stringify(value);
  return index === obj.arr.findIndex(obj => {
    return JSON.stringify(obj) === _value;
  });
});
Using the above property strategy instead of JSON.stringify:
const isPropValuesEqual = (subject, target, propNames) =>
  propNames.every(propName => subject[propName] === target[propName]);

const getUniqueItemsByProperties = (items, propNames) =>
  items.filter((item, index, array) =>
    index === array.findIndex(foundItem => isPropValuesEqual(foundItem, item, propNames))
  );
You can add a wrapper if you want the propNames parameter to be either an array or a single value:

const getUniqueItemsByProperties = (items, propNames) => {
  // Array.from would split a multi-character string like 'place' into
  // characters, so wrap non-array values explicitly instead
  const propNamesArray = Array.isArray(propNames) ? propNames : [propNames];
  return items.filter((item, index, array) =>
    index === array.findIndex(foundItem => isPropValuesEqual(foundItem, item, propNamesArray))
  );
};
allowing both getUniqueItemsByProperties('a') and getUniqueItemsByProperties(['a']).
Upvotes: 984
Reputation: 2814
This works for me:
const uniqueArray = products.filter((value, index) => {
  return index === products.findIndex((obj) => {
    return JSON.stringify(obj) === JSON.stringify(value);
  });
});
Upvotes: 1
Reputation: 327
If the array contains objects, then you can use this to remove duplicates:
const persons= [
{ id: 1, name: 'John',phone:'23' },
{ id: 2, name: 'Jane',phone:'23'},
{ id: 1, name: 'Johnny',phone:'56' },
{ id: 4, name: 'Alice',phone:'67' },
];
const unique = [...new Map(persons.map((m) => [m.id, m])).values()];
To remove duplicates on the basis of phone instead, just replace m.id with m.phone:
const unique = [...new Map(persons.map((m) => [m.phone, m])).values()];
Upvotes: 13
Reputation: 3214
TypeScript function to filter an array to its unique elements where uniqueness is decided by the given predicate function:
function uniqueByPredicate<T>(arr: T[], predicate: (a: T, b: T) => boolean): T[] {
  return arr.filter((v1, i, a) => a.findIndex(v2 => predicate(v1, v2)) === i);
}
Without typings:

function uniqueByPredicate(arr, predicate) {
  return arr.filter((v1, i, a) => a.findIndex(v2 => predicate(v1, v2)) === i);
}
Upvotes: 0
Reputation: 13409
Using ES6+ in a single line you can get a unique list of objects by key:
const key = 'place';
const unique = [...new Map(arr.map(item => [item[key], item])).values()]
It can be put into a function:
function getUniqueListBy(arr, key) {
return [...new Map(arr.map(item => [item[key], item])).values()]
}
Here is a working example:
const arr = [
{place: "here", name: "x", other: "other stuff1" },
{place: "there", name: "x", other: "other stuff2" },
{place: "here", name: "y", other: "other stuff4" },
{place: "here", name: "z", other: "other stuff5" }
]
function getUniqueListBy(arr, key) {
return [...new Map(arr.map(item => [item[key], item])).values()]
}
const arr1 = getUniqueListBy(arr, 'place')
console.log("Unique by place")
console.log(JSON.stringify(arr1))
console.log("\nUnique by name")
const arr2 = getUniqueListBy(arr, 'name')
console.log(JSON.stringify(arr2))
First the array is remapped in a way that it can be used as an input for a Map.
arr.map(item => [item[key], item]);
which means each item of the array is transformed into another array with two elements: the selected key as the first element and the entire initial item as the second. This is called an entry (e.g. array entries, map entries). And here is the official doc with an example showing how to pass array entries to the Map constructor.
Example when key is place:
[["here", {place: "here", name: "x", other: "other stuff1" }], ...]
Secondly, we pass this modified array to the Map constructor, and here is where the magic happens: Map eliminates duplicate keys, keeping only the last inserted value for each key. Note: Map keeps the order of insertion. (Check the difference between Map and object.)
new Map(entry array just mapped above)
Third, we use the Map's values to retrieve the original items, but this time without duplicates.
new Map(mappedArr).values()
And the last step is to spread those values into a fresh new array, so the result has the initial structure, and return that:
return [...new Map(mappedArr).values()]
Upvotes: 326
Reputation: 2408
That's my solution: add the array items into a key/value object, where the key is the unique identifier and the value can be any property of the object, or the whole object.
Explanation: The main array with duplicate items is transformed into a key/value object. If the Id already exists in the unique object, its value is overwritten. At the end, the unique object is converted back into an array.

getUniqueItems(array) {
  const unique = {};
  // here we are assigning item.name, but it could be the complete object
  array.forEach(item => unique[item.Id] = item.name);
  // here you can transform your array items like {text: unique[key], value: key}, or do whatever you want
  return Object.keys(unique).map(key => ({text: unique[key], value: key}));
}
Upvotes: 0
Reputation: 5698
To remove all duplicates from an array of objects, the simplest way is to use filter:
var uniq = {};
var arr = [{"id":"1"},{"id":"1"},{"id":"2"}];
var arrFiltered = arr.filter(obj => !uniq[obj.id] && (uniq[obj.id] = true));
console.log('arrFiltered', arrFiltered);
Upvotes: 47
Reputation: 4530
Consider lodash.uniqWith:
const objects = [{ 'x': 1, 'y': 2 }, { 'x': 2, 'y': 1 }, { 'x': 1, 'y': 2 }];
_.uniqWith(objects, _.isEqual);
// => [{ 'x': 1, 'y': 2 }, { 'x': 2, 'y': 1 }]
Upvotes: 17
Reputation: 155
You can use a for loop and a condition to make it unique:
const data = [
{ id: 1 },
{ id: 2 },
{ id: 3 },
{ id: 4 },
{ id: 5 },
{ id: 6 },
{ id: 6 },
{ id: 6 },
{ id: 7 },
{ id: 8 },
{ id: 8 },
{ id: 8 },
{ id: 8 }
];
const filtered = [];
for (let i = 0; i < data.length; i++) {
  let isUnique = true;
  for (let j = 0; j < filtered.length; j++) {
    if (filtered[j].id === data[i].id) {
      isUnique = false;
    }
  }
  if (isUnique) {
    filtered.push(data[i]);
  }
}
console.log(filtered);
/*
output
[ { id: 1 },
{ id: 2 },
{ id: 3 },
{ id: 4 },
{ id: 5 },
{ id: 6 },
{ id: 7 },
{ id: 8 } ]
*/
Upvotes: 0
Reputation: 89
We can leverage JavaScript's Set object and Array's filter function. For example:
// Example Array
const arr = [{ id: '1' }, { id: '2' }, { id: '1' }];
// Gather the unique element ids on which to filter.
const uniqIds = arr.reduce((ids, el) => ids.add(el.id), new Set());
// Keep only the first occurrence of each id (Set.delete returns true only the first time).
const uniqElements = arr.filter((el) => uniqIds.delete(el.id));
console.log(uniqElements);
Upvotes: 0
Reputation: 386786
This is a single-loop approach with a Set and some closures, to avoid using declared variables outside function declarations and to keep it short.
const
array = [{ place: "here", name: "stuff", n: 1 }, { place: "there", name: "morestuff", n: 2 }, { place: "there", name: "morestuff", n: 3 }],
keys = ['place', 'name'],
unique = array.filter(
(s => o => (v => !s.has(v) && s.add(v))(keys.map(k => o[k]).join('|')))
(new Set)
);
console.log(unique);
Upvotes: 1
Reputation: 22354
One-liners with Map (high performance, does not preserve order)

Find unique ids in array arr.
const arrUniq = [...new Map(arr.map(v => [v.id, v])).values()]
If the order is important, check out the solution with filter: Solution with filter
Unique by multiple properties (place and name) in array arr
const arrUniq = [...new Map(arr.map(v => [JSON.stringify([v.place,v.name]), v])).values()]
Unique by all properties in array arr:
const arrUniq = [...new Map(arr.map(v => [JSON.stringify(v), v])).values()]
Keep the first occurrence in array arr:
const arrUniq = [...new Map(arr.slice().reverse().map(v => [v.id, v])).values()].reverse()
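If the double reverse feels opaque, an equivalent sketch that keeps the first occurrence is to only set a key the first time it is seen (this version also happens to preserve order; the sample data is made up for illustration):

```javascript
const arr = [
  { id: 1, x: 'first' },
  { id: 2, x: 'only' },
  { id: 1, x: 'second' }
];

// Only set a key the first time it is seen, so the first occurrence wins
const m = new Map();
for (const v of arr) {
  if (!m.has(v.id)) m.set(v.id, v);
}
const arrUniq = [...m.values()];

console.log(arrUniq); // [{ id: 1, x: 'first' }, { id: 2, x: 'only' }]
```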
Upvotes: 40
Reputation: 5125
Here's an ES6 one-liner:
let arr = [
{id:1,name:"sravan ganji"},
{id:2,name:"pinky"},
{id:4,name:"mammu"},
{id:3,name:"avy"},
{id:3,name:"rashni"},
];
console.log(Object.values(arr.reduce((acc,cur)=>Object.assign(acc,{[cur.id]:cur}),{})))
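One caveat worth knowing with this approach: integer-like object keys are iterated in ascending numeric order, so Object.values returns the result sorted by id rather than in input order. A small sketch:

```javascript
const arr = [
  { id: 4, name: 'mammu' },
  { id: 1, name: 'sravan' }
];

// The accumulator object's keys are "4" and "1"; integer-like keys
// are enumerated in ascending numeric order, not insertion order.
const deduped = Object.values(
  arr.reduce((acc, cur) => Object.assign(acc, { [cur.id]: cur }), {})
);

console.log(deduped); // id 1 comes first, even though id 4 was first in the input
```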
Upvotes: 62
Reputation: 25279
A primitive method would be:
const obj = {};
for (let i = 0, len = things.thing.length; i < len; i++) {
  obj[things.thing[i]['place']] = things.thing[i];
}

things.thing = [];
for (const key in obj) {
  things.thing.push(obj[key]);
}
Upvotes: 202
Reputation: 10957
export const uniqueBy = <T>(uniqueKey: keyof T, objects: T[]): T[] => {
  const ids = objects.map(object => object[uniqueKey]);
  return objects.filter((object, index) => !ids.includes(object[uniqueKey], index + 1));
};
Upvotes: 11
Reputation: 399
You could use a Set along with the filter method to accomplish this:
var arrObj = [{
a: 1,
b: 2
}, {
a: 1,
b: 1
}, {
a: 1,
b: 2
}];
var duplicateRemover = new Set();
var distinctArrObj = arrObj.filter((obj) => {
  if (duplicateRemover.has(JSON.stringify(obj))) return false;
  duplicateRemover.add(JSON.stringify(obj));
  return true;
});
console.log(distinctArrObj);
Set is a collection of unique values of primitive types, so it won't work directly on objects; however, JSON.stringify converts an object into a primitive type (a string), and thus we can filter.
If you want to remove duplicates based only on some particular key, e.g. key, you can replace JSON.stringify(obj) with obj.key.
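For example, a sketch of that key-based variant, assuming the objects carry an id property to deduplicate on:

```javascript
const arrObj = [{ id: 1, b: 2 }, { id: 1, b: 1 }, { id: 2, b: 2 }];

// Track ids already seen; keep only the first object for each id
const seenIds = new Set();
const distinctById = arrObj.filter((obj) => {
  if (seenIds.has(obj.id)) return false;
  seenIds.add(obj.id);
  return true;
});

console.log(distinctById); // [{ id: 1, b: 2 }, { id: 2, b: 2 }]
```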
Upvotes: 2
Reputation: 1024
Here is a simple solution for removing duplicates from an array of objects using the reduce method. I am filtering elements based on the position key of each object:
const med = [
{name: 'name1', position: 'left'},
{name: 'name2', position: 'right'},
{name: 'name3', position: 'left'},
{name: 'name4', position: 'right'},
{name: 'name5', position: 'left'},
{name: 'name6', position: 'left1'}
]
const arr = [];
med.reduce((acc, curr) => {
  if (acc.indexOf(curr.position) === -1) {
    acc.push(curr.position);
    arr.push(curr);
  }
  return acc;
}, []);
console.log(arr);
Upvotes: 3
Reputation: 1
Removing duplicates from an array of objects in React (working perfectly):
let optionList = [];
var dataArr = this.state.itemArray.map(item => {
  return [item.name, item];
});
var maparr = new Map(dataArr);
var results = [...maparr.values()];
if (results.length > 0) {
  results.map(data => {
    if (data.lead_owner !== null) {
      optionList.push({ label: data.name, value: data.name });
    }
    return true;
  });
}
console.log(optionList);
Upvotes: 1
Reputation: 17612
I know there are a ton of answers to this question already, but bear with me...
Some of the objects in your array may have additional properties that you are not interested in, or you may simply want to find the unique objects considering only a subset of the properties.
Consider the array below. Say you want to find the unique objects in this array considering only propOne and propTwo, and ignore any other properties that may be there.
The expected result should include only the first and last objects. So here goes the code:
const array = [{
propOne: 'a',
propTwo: 'b',
propThree: 'I have no part in this...'
},
{
propOne: 'a',
propTwo: 'b',
someOtherProperty: 'no one cares about this...'
},
{
propOne: 'x',
propTwo: 'y',
yetAnotherJunk: 'I am valueless really',
noOneHasThis: 'I have something no one has'
}];
const uniques = [...new Set(
  array.map(x => JSON.stringify(((o) => ({
    propOne: o.propOne,
    propTwo: o.propTwo
  }))(x))))
].map(JSON.parse);
console.log(uniques);
Upvotes: 6
Reputation: 107
This solution worked best for me, utilising the Array.from method; it's also shorter and more readable.
let person = [
{name: "john"},
{name: "jane"},
{name: "imelda"},
{name: "john"},
{name: "jane"}
];
const data = Array.from(new Set(person.map(JSON.stringify))).map(JSON.parse);
console.log(data);
Upvotes: 5
Reputation: 424
const objectsMap = new Map();
const placesName = [
  { place: "here", name: "stuff" },
  { place: "there", name: "morestuff" },
  { place: "there", name: "morestuff" },
];
placesName.forEach((object) => {
  objectsMap.set(object.place, object);
});
const unique = [...objectsMap.values()];
console.log(unique);
Upvotes: 2
Reputation: 355
ES6 magic in one line... readable at that!

// removes from `a` the items whose 'prop' value already appears in `b`
const removeDuplicatesWith = (a, b, prop) =>
  a.filter(x => !b.find(y => x[prop] === y[prop]));
Upvotes: 2
Reputation: 27559
I think the best approach is using reduce and a Map object. This is a single-line solution.
const data = [
{id: 1, name: 'David'},
{id: 2, name: 'Mark'},
{id: 2, name: 'Lora'},
{id: 4, name: 'Tyler'},
{id: 4, name: 'Donald'},
{id: 5, name: 'Adrian'},
{id: 6, name: 'Michael'}
]
const uniqueData = [...data.reduce((map, obj) => map.set(obj.id, obj), new Map()).values()];
console.log(uniqueData)
/*
in `map.set(obj.id, obj)`
'obj.id' is key. (don't worry. we'll get only values using the .values() method)
'obj' is whole object.
*/
Upvotes: 23
Reputation: 21
If you are using the Lodash library, you can use the function below as well. It removes duplicate objects.
var objects = [{ 'x': 1, 'y': 2 }, { 'x': 2, 'y': 1 }, { 'x': 1, 'y': 2 }];
_.uniqWith(objects, _.isEqual);
Upvotes: 0
Reputation: 1527
const things = [
{place:"here",name:"stuff"},
{place:"there",name:"morestuff"},
{place:"there",name:"morestuff"}
];
const filteredArr = things.reduce((acc, current) => {
  const x = acc.find(item => item.place === current.place);
  if (!x) {
    return acc.concat([current]);
  } else {
    return acc;
  }
}, []);
console.log(filteredArr)
Solution via the Set object | according to the data type
const seen = new Set();
const things = [
{place:"here",name:"stuff"},
{place:"there",name:"morestuff"},
{place:"there",name:"morestuff"}
];
const filteredArr = things.filter(el => {
  const duplicate = seen.has(el.place);
  seen.add(el.place);
  return !duplicate;
});
console.log(filteredArr)
Set object features
Each value in a Set has to be unique; value equality is checked.
The purpose of the Set object is to store unique values, whether primitive values or object references, according to their data type. It has four very useful instance methods: add, clear, has & delete.
add method: pushes unique data into the collection, preserving its data type by default. That means it prevents pushing a duplicate item into the collection, and it checks the data type by default.
has method: sometimes you need to check whether a data item exists in the collection. It's a handy method for checking whether a unique id or item (and its data type) is already in the collection.
delete method: removes a specific item from the collection, matched by value and data type.
clear method: removes all items from the collection, leaving an empty Set.
The Set object also has iteration methods & more features.
Better read from here: Set - JavaScript | MDN
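A small sketch of those four instance methods in action:

```javascript
const seen = new Set();

seen.add('here');
seen.add('here'); // the duplicate is silently ignored

console.log(seen.size);        // 1
console.log(seen.has('here')); // true
console.log(seen.has(1));      // false (the number 1 was never added)

seen.delete('here');
console.log(seen.size); // 0

seen.add('a').add('b'); // add() returns the Set, so calls can be chained
seen.clear();
console.log(seen.size); // 0
```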
Upvotes: 13
Reputation: 11460
The problem can be simplified to removing duplicates from the thing array.
You can implement a faster O(n) solution (assuming native key lookup is negligible) by using an object to both maintain unique criteria as keys and storing associated values.
Basically, the idea is to store all objects by their unique key, so that duplicates overwrite themselves:
const thing = [{ place: "here", name:"stuff" }, { place: "there", name:"morestuff" }, { place: "there", name:"morestuff" } ]
const uniques = {}
for (const t of thing) {
  const key = t.place + '$' + t.name // Or whatever string criteria you want, which can be generified as Object.values(t).join("$")
  uniques[key] = t // Last duplicate wins
}
const uniqueThing = Object.values(uniques)
console.log(uniqueThing)
Upvotes: 2
Reputation: 107
Here is a solution for ES6 where you only want to keep the last item. This solution is functional and Airbnb style compliant.
const things = {
thing: [
{ place: 'here', name: 'stuff' },
{ place: 'there', name: 'morestuff1' },
{ place: 'there', name: 'morestuff2' },
],
};
const removeDuplicates = (array, key) => {
  return array.reduce((arr, item) => {
    const removed = arr.filter(i => i[key] !== item[key]);
    return [...removed, item];
  }, []);
};
console.log(removeDuplicates(things.thing, 'place'));
// > [{ place: 'here', name: 'stuff' }, { place: 'there', name: 'morestuff2' }]
Upvotes: 9