Reputation: 3062
I'm trying to upload over 5,000 comments from a CSV and then insert them into a collection.
I get the following:
all done dfae22fc33f08cde515ac7452729cf4921d63ebe.js:24
insert failed: MongoError: E11000 duplicate key error index: ag5Uriwu.comments.$_id_ dup key: { : "SuvPB3frrkLs8nErv" } dfae22fc33f08cde515ac7452729cf4921d63ebe.js:1
Connection timeout. No DDP heartbeat received.
The script at hand:
'click .importComments': function(e) {
  var $self = $(e.target);
  $self.text("Importing...");
  // Parse the uploaded CSV with Papa Parse, running the parser in a web worker.
  $("#commentsCSV").parse({
    worker: true,
    config: {
      // Called for each chunk of parsed rows.
      step: function(row) {
        var data = row.data;
        for (var key in data) {
          var obj = data[key];
          // Look up the post this comment belongs to by its legacy ID.
          var post = Posts.findOne({legacyId: obj[1]});
          var comment = {
            // attributes here
          };
          Comments.insert(comment);
          // Keep the denormalized comment count on the post in sync.
          Posts.update(comment.postId, {
            $inc: { commentsCount: 1 }
          });
        }
        $self.text("Import Comments");
      },
      complete: function(results, file) {
        console.log("all done");
      }
    }
  });
}
How can I make this work without blowing up with connection timeout errors?
Locally it seems to work decently but on production (modulus.io) it ends pretty abruptly.
Upvotes: 1
Views: 745
Reputation: 768
I think the problem here isn't DDP itself but MongoDB; the DDP connection is timing out as a consequence of the MongoDB error.
You're getting a duplicate key error on the _id field. The _id field is automatically indexed by MongoDB, and that index is unique, so the same value cannot appear twice in the same collection.
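For illustration, here is a minimal sketch of what that means; only the Comments collection and the duplicated _id value come from the question, the body fields are made up:

// The first insert succeeds; MongoDB maintains a unique index on _id.
Comments.insert({ _id: "SuvPB3frrkLs8nErv", body: "first comment" });

// A second insert reusing the same _id violates that unique index and fails
// with an E11000 duplicate key error, just like the one in the question.
Comments.insert({ _id: "SuvPB3frrkLs8nErv", body: "second comment" }, function(err) {
  if (err) console.log("insert failed:", err.message);
});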
The CSV you're uploading likely carries its own _id fields, which means Mongo is not generating its own IDs (which are guaranteed to be unique). So I'd recommend removing the _id field from the CSV if it exists.
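If stripping the column from the file itself isn't convenient, a minimal sketch of the same idea in code, assuming the parsed row object (called rowData here, a hypothetical name) is what ends up being inserted:

// Remove any _id that came from the CSV so the insert gets a freshly
// generated, guaranteed-unique _id instead of reusing the exported one.
delete rowData._id;
Comments.insert(rowData);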
You can also try using the following package: http://atmospherejs.com/package/csv-to-collection
Upvotes: 2