Reputation: 5725
My MongoDB data looks like the example documents at the end of this question.
Please look at the last field, time: as you can see, I have some "duplicate" documents whose timestamps fall within the same minute.
For a small database, I can remove the duplicate values with the code below:
var cursor = db.getCollection("light").aggregate([
    { $group: {
        "_id": {
            index: "$index",
            unit: "$unit",
            min: "$min",
            max: "$max",
            node: "$node",
            year: { "$year": "$time" },
            dayOfYear: { "$dayOfYear": "$time" },
            hour: { "$hour": "$time" },
            minute: { "$minute": "$time" }
        },
        _id_not_delete: { $last: "$_id" }
    } }
], {
    "allowDiskUse": true
});
var ids_not_delete = cursor.map(function (doc) { return doc._id_not_delete; });
db.getCollection("light").remove({"_id": { "$nin": ids_not_delete }});
But my database has more than 20 million records, so I receive this error:
E QUERY [js] Error: Converting from JavaScript to BSON failed: Object size 23146644 exceeds limit of 16793600 bytes. :
Bulk/addToOperationsList@src/mongo/shell/bulk_api.js:611:28
Bulk/findOperations.remove@src/mongo/shell/bulk_api.js:743:24
DBCollection.prototype.remove@src/mongo/shell/collection.js:404:13
@(shell):1:1
I know that the root cause is
The maximum BSON document size is 16 megabytes
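(Rough arithmetic: an ObjectId is 12 bytes, and each element of the $nin array adds several more bytes of BSON key/type overhead, so roughly 20 bytes per kept _id; a bit over a million kept ids already gives the ~23 MB reported in the error, well past the 16 MB limit.)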
I think I need to change the line below, but I don't have a good solution:
var ids_not_delete = cursor.map(function (doc) { return doc._id_not_delete; });
Do you have any ideas on how to optimize my code?
Example documents in the collection:
{
    "_id" : ObjectId("5be22d5808c08300545effee"),
    "index" : "LIGHT",
    "unit" : "LUX",
    "min" : NumberInt(5),
    "max" : NumberInt(6),
    "avg" : 5.5,
    "node" : "TH",
    "time" : ISODate("2018-11-07T00:10:00.091+0000")
},
{
    "_id" : ObjectId("5be22b0052122e0047c3467c"),
    "index" : "LIGHT",
    "unit" : "LUX",
    "min" : NumberInt(3),
    "max" : NumberInt(5),
    "avg" : NumberInt(4),
    "node" : "TH",
    "time" : ISODate("2018-11-07T00:00:00.204+0000")
},
{
    "_id" : ObjectId("5be22b0008c08300545eff79"),
    "index" : "LIGHT",
    "unit" : "LUX",
    "min" : NumberInt(3),
    "max" : NumberInt(5),
    "avg" : NumberInt(4),
    "node" : "TH",
    "time" : ISODate("2018-11-07T00:00:00.081+0000")
}
MongoDB shell version v4.0.2
MongoDB 4.0.0
Upvotes: 2
Views: 3394
Reputation: 37048
You can invert your aggregation to select the ids you want to delete rather than the ones you want to keep:
const toDelete = db.getCollection("light").aggregate([
    { $group: {
        "_id": {
            index: "$index",
            unit: "$unit",
            min: "$min",
            max: "$max",
            node: "$node",
            year: { "$year": "$time" },
            dayOfYear: { "$dayOfYear": "$time" },
            hour: { "$hour": "$time" },
            minute: { "$minute": "$time" }
        },
        ids: { $push: "$_id" }
    } },
    { $project: { _id: { $slice: ["$ids", 1, 10000] } } },
    { $unwind: "$_id" },
    { $project: { _id: 0, deleteOne: { "filter": { "_id": "$_id" } } } }
], { allowDiskUse: true }).toArray()  // allowDiskUse, as in your original pipeline, since the $group spans 20M+ documents
10,000 here is just any number large enough to exceed the expected number of duplicates within a group; the $slice skips the first _id of each group, so that document is the one that is kept.
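Each element of toDelete already has the shape bulkWrite expects for a delete operation, for example (with an illustrative _id taken from the sample documents):
{ deleteOne: { filter: { _id: ObjectId("5be22b0052122e0047c3467c") } } }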
Then you can use bulkWrite:
db.getCollection("light").bulkWrite(toDelete);
The driver will split the array into batches of 100,000 delete operations each.
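If even the toDelete array itself becomes too large to build in shell memory, a variation on the same idea (just a sketch, reusing the same pipeline, with an arbitrary flush size of 1,000 groups) is to stream the aggregation cursor and flush deletes in smaller bulkWrite batches:
// Sketch: stream the aggregation cursor and delete duplicates in batches,
// so the full list of duplicate _ids never has to fit in shell memory.
var batch = [];
db.getCollection("light").aggregate([
    { $group: {
        "_id": {
            index: "$index", unit: "$unit", min: "$min", max: "$max", node: "$node",
            year: { "$year": "$time" },
            dayOfYear: { "$dayOfYear": "$time" },
            hour: { "$hour": "$time" },
            minute: { "$minute": "$time" }
        },
        ids: { $push: "$_id" }
    } },
    // keep the first _id of each group, collect the rest for deletion
    { $project: { _id: 0, dupIds: { $slice: ["$ids", 1, 10000] } } }
], { allowDiskUse: true }).forEach(function (doc) {
    if (doc.dupIds.length > 0) {
        batch.push({ deleteMany: { filter: { _id: { $in: doc.dupIds } } } });
    }
    if (batch.length >= 1000) {  // arbitrary flush size
        db.getCollection("light").bulkWrite(batch, { ordered: false });
        batch = [];
    }
});
if (batch.length > 0) {
    db.getCollection("light").bulkWrite(batch, { ordered: false });
}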
Upvotes: 1