Reputation: 11
The document looks like this,
{
    "_id" : "ffffff999999999f9ff9f9f9f",
    "Name" : "John Doe",
    "Array" : [{
        "Id1" : "a8ed3d86b8464e0cae4672cef3862860",
        "Id2" : "6d7aac14b1e142abafde167d928e3dbc",
        "Id3" : "2323232"
    }]
}
There is a unique index on the combination of the three array fields, so the index looks like this,
"key" : {
"Array.Id1" : 1,
"Array.Id2" : 1,
"Array.Id3" : 1
},
"unique" : true,
My requirement: I would like to find documents by their "_id" value and remove the array field completely.
Problem: I was trying something like this,
db.Collection.update({"_id":"ffffffff52a8a15ce4b05d6a8d40f973"},{$unset:{Array:1}})
When I try to unset the array, the first update goes through, but subsequent ones fail with the following error,
*E11000 duplicate key error index: int.Collection.$Array.Id1_1_Array.Id2_1_Array.Id3_1 dup key: { : null, : null, : null }*
I am wondering if there is any workaround for this issue. I would have to run this on a huge collection, and modifying the indexes is not an option.
Any suggestions would be greatly appreciated.
Thanks.
Upvotes: 0
Views: 253
Reputation: 5095
That is how unique indexes work: you cannot have duplicate values on the indexed fields, and that includes null (or leaving the field undefined).
However, the sparse option allows you to achieve this.
Usage: db.collection.ensureIndex( { a: 1 }, { unique: true, sparse: true } )
So if you are going to unset the array, you might want to re-index with the sparse option.
Warning: Using these indexes will sometimes result in incomplete results
when filtering or sorting results, because sparse indexes are not complete
for all documents in a collection.
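A rough sketch of what re-indexing with the sparse option could look like for the index in the question (dropping and recreating the index, and the collection name Collection, are assumptions; the question does note that modifying indexes may not be an option):

db.Collection.dropIndex({"Array.Id1" : 1, "Array.Id2" : 1, "Array.Id3" : 1})
db.Collection.ensureIndex(
    {"Array.Id1" : 1, "Array.Id2" : 1, "Array.Id3" : 1},
    {unique : true, sparse : true}
)
// Documents whose Array field has been unset carry none of the indexed
// fields, so they are left out of the sparse index and repeating the $unset
// update on further documents no longer raises E11000.
db.Collection.update({"_id" : "ffffffff52a8a15ce4b05d6a8d40f973"}, {$unset : {Array : 1}})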
Upvotes: 1
Reputation: 534
As this minimal example demonstrates, it is not possible to have the unique-indexed field missing from more than one document right from the start; the problem does not only occur when you try to unset the field later on.
> db.coll.ensureIndex({indexField:1}, {unique: true})
> db.coll.insert({name: "doc without index field 1"})
> db.coll.insert({name: "doc without index field 2"})
E11000 duplicate key error index: test.coll.$indexField_1 dup key: { : null }
You will have to replace the field with something else - not necessarily an array, but it will have to be unique...
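One rough sketch of that idea, continuing the example above (using ObjectId() as the per-document placeholder is just an assumption; any value that is unique per document would do):

> db.coll.insert({name: "doc without index field 2", indexField: ObjectId()})

The second insert now succeeds because every document carries its own distinct value in indexField. Applied to the question, that would mean $set-ing a unique placeholder instead of $unset-ing the array.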
Upvotes: 0