user3734568

Reputation: 1461

Update MongoDB collection by matching _id

I am trying to update a MongoDB collection from Python using a condition on _id.

If I find a matching _id in my Python DataFrame, I need to update the corresponding document in the collection. The script below works, but it takes a long time to execute when there are many documents. Is there a more efficient way to handle this? Please advise.

for document in db.AMTicketData.find():
    for index, row in AMTicketData1.iterrows():
        if row['_id'] == document['_id']:
            db.AMTicketData.update_one({'_id': row['_id']}, {'$set': {'Application_Name': row['Application_Name']}}, upsert=True)
            break

I have used the bulk operation code below and was able to update the collection in bulk:

bulk = db.AMTicketData.initialize_unordered_bulk_op()
for index, row in AMTicketData1.iterrows():
    bulk.find({'_id':row['_id']}).update_one({'$set':{'Application_Name':row['Application_Name']}})

bulk.execute()

Upvotes: 2

Views: 306

Answers (1)

Jair Batista

Reputation: 36

You could try using a bulk write. You just have to build a list with all the updates and apply it once with collection.bulk_write(list_of_updates).

Something like:

from pymongo import UpdateOne

updates = []
for document in db.AMTicketData.find():
    for index, row in AMTicketData1.iterrows():
        if row['_id'] == document['_id']:
            updates.append(UpdateOne({'_id': row['_id']}, {'$set': {'Application_Name': row['Application_Name']}}, upsert=True))
            break

db.AMTicketData.bulk_write(updates)
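Since the _id is already used in the update filter, you could also build the list straight from the DataFrame rows and skip scanning the collection entirely. A minimal sketch, assuming every row of AMTicketData1 has the _id and Application_Name columns shown above:

from pymongo import UpdateOne

# One UpdateOne per DataFrame row; the _id filter does the matching,
# so there is no need to iterate over the collection first.
updates = [
    UpdateOne(
        {'_id': row['_id']},
        {'$set': {'Application_Name': row['Application_Name']}},
        upsert=True,
    )
    for _, row in AMTicketData1.iterrows()
]

if updates:
    result = db.AMTicketData.bulk_write(updates, ordered=False)
    print(result.modified_count, result.upserted_count)

With ordered=False the server is free to apply the operations in any order and continue past individual failures, which usually suits independent per-_id updates like these.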

https://docs.mongodb.com/manual/core/bulk-write-operations/

Upvotes: 2
