Reputation: 164
Here is the aggregation query which fails unexpectedly:
db.sq_lesson_user_lessons.aggregate([
  {
    "$match": {
      lesson_id: {
        "$in": [ObjectId("5bb6ec0a178353bbdecdd94d"), ObjectId("5bbf1e611783538013ce2f0a")]
      },
      status: { "$in": ['featured', 'started', 'pending', 'completed'] }
    }
  },
  {
    "$project": {
      _id: 1,
      user_profile_id: 1,
      status: 1,
      lesson_id: 1
    }
  },
  {
    "$out": "analytics_company_5bb6039598f17297c964fc54_sq_user_lessons"
  }
])
assert: command failed: {
"operationTime" : Timestamp(1542715086, 67659),
"ok" : 0,
"errmsg" : "insert for $out failed: { lastOp: { ts: Timestamp(1542715086, 67657), t: 39 }, connectionId: 242551, err: \"E11000 duplicate key error collection: api_smartquest_co_production.tmp.agg_out.637145 index: _id_ dup key: { : ObjectId('5bf22e554b8a982ada5e2828') }\", code: 11000, codeName: \"DuplicateKey\", n: 0, ok: 1.0, operationTime: Timestamp(1542715086, 67657), $clusterTime: { clusterTime: Timestamp(1542715086, 67658), signature: { hash: BinData(0, 0000000000000000000000000000000000000000), keyId: 0 } } }",
"code" : 16996,
"codeName" : "Location16996",
"$clusterTime" : {
"clusterTime" : Timestamp(1542715086, 67659),
"signature" : {
"hash" : BinData(0,"wvZz15/714/PHqAWywLpZlP4azQ="),
"keyId" : NumberLong("6606442824109916161")
}
}
} : aggregate failed
This aggregation produces about 300 thousand documents. Sometimes the aggregation succeeds and sometimes it fails with the error above.
Upvotes: 2
Views: 1081
Reputation: 164
This was answered by Daniel Hatcher from MongoDB in a comment.
The crux of the answer is:
As the aggregation scans the large collection to produce its results, it is possible for some documents to be returned more than once. This is related to MongoDB's read isolation behavior: a document that is moved or updated while the cursor is iterating can be returned again, and the duplicate _id then causes the insert into the temporary $out collection to fail.
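One possible workaround, sketched below, is to collapse duplicates with a $group stage on _id before $out, so a document the scan returns twice is written only once. This is an assumption on my part, not part of the original answer; the collection, field names, and ObjectIds are taken from the question, and $group/$first and allowDiskUse are standard MongoDB features.
// Hypothetical workaround (not from the accepted answer): dedupe on _id before $out.
db.sq_lesson_user_lessons.aggregate([
  {
    "$match": {
      lesson_id: {
        "$in": [ObjectId("5bb6ec0a178353bbdecdd94d"), ObjectId("5bbf1e611783538013ce2f0a")]
      },
      status: { "$in": ['featured', 'started', 'pending', 'completed'] }
    }
  },
  {
    // Group on _id so any document returned more than once by the scan
    // collapses into a single output document.
    "$group": {
      _id: "$_id",
      user_profile_id: { "$first": "$user_profile_id" },
      status: { "$first": "$status" },
      lesson_id: { "$first": "$lesson_id" }
    }
  },
  {
    "$out": "analytics_company_5bb6039598f17297c964fc54_sq_user_lessons"
  }
], { allowDiskUse: true })
allowDiskUse is included because grouping roughly 300 thousand documents may exceed the in-memory stage limit; whether that is needed depends on document size.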
Upvotes: 3