Reputation: 161
I have some documents like this:
{ name: 'John', likes: ['reading', 'cars'], dislikes: ['trees', 'cats']},
{ name: 'Ally', likes: ['drawing', 'cars'], dislikes: ['people', 'cats']},
{ name: 'Michelle', likes: [], dislikes: ['trees', 'cats']},
I'd like the aggregation to return all the unique values of each field, like this:
{
name: ['John', 'Ally', 'Michelle'],
likes: ['reading', 'cars', 'drawing'],
dislikes: ['trees', 'cats', 'people']
}
Note that the name field is not an array while the others are, and not all documents include every array (Michelle, for example, has no likes). I've tried several $group methods with $unwind, which either don't include all the names properly (100 unique names, but only 10 returned) or break because some documents are missing a field such as 'likes'. I've been racking my brain on this all day, thanks for your help.
Upvotes: 2
Views: 1247
Reputation: 151190
One $group with $addToSet on all keys, after $unwind-ing each array with preserveNullAndEmptyArrays so that documents with an empty or missing array (like Michelle's likes) are not dropped:
db.collection.aggregate([
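// Flatten each array; preserveNullAndEmptyArrays keeps documents whose array is empty or missing (e.g. Michelle's likes)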
{ "$unwind": { "path": "$likes", "preserveNullAndEmptyArrays": true } },
{ "$unwind": { "path": "$dislikes", "preserveNullAndEmptyArrays": true } },
{ "$group": {
"_id": null,
"name": { "$addToSet": "$name" },
"likes": { "$addToSet": "$likes" },
"dislikes": { "$addToSet": "$dislikes" }
}},
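// Optionally strip null values from each result array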
{ "$project": {
"_id": 0,
"name": {
"$filter": {
"input": "$name",
"as": "el",
"cond": { "$ne": [ "$$el", null ] }
}
},
"likes": {
"$filter": {
"input": "$likes",
"as": "el",
"cond": { "$ne": [ "$$el", null ] }
}
},
"dislikes": {
"$filter": {
"input": "$dislikes",
"as": "el",
"cond": { "$ne": [ "$$el", null ] }
}
}
}}
])
Use the $project with $filter stages only if you really need to remove null values from the result arrays.
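If null values in the result arrays are acceptable, a minimal sketch of the shorter variant (assuming the same collection name) is the same $unwind and $group stages with only a trivial $project to drop _id:
db.collection.aggregate([
  { "$unwind": { "path": "$likes", "preserveNullAndEmptyArrays": true } },
  { "$unwind": { "path": "$dislikes", "preserveNullAndEmptyArrays": true } },
  { "$group": {
    "_id": null,
    "name": { "$addToSet": "$name" },
    "likes": { "$addToSet": "$likes" },
    "dislikes": { "$addToSet": "$dislikes" }
  }},
  { "$project": { "_id": 0 } }
])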
Upvotes: 2