David Makovoz

Reputation: 1918

MongoDB bulk write error

I'm executing a bulk write:

bulk = new_packets.initialize_ordered_bulk_op()
bulk.insert(packet)
output = bulk.execute()

and I'm getting an error that I interpret to mean that packet is not a dict. However, I know that it is a dict. What could be the problem?

Here is the error:

    BulkWriteError                            Traceback (most recent call last)
    <ipython-input-311-93f16dce5714> in <module>()
          2
          3 bulk.insert(packet)
    ----> 4 output = bulk.execute()

    C:\Users\e306654\AppData\Local\Continuum\Anaconda\lib\site-packages\pymongo\bulk.pyc in execute(self, write_concern)
        583         if write_concern and not isinstance(write_concern, dict):
        584             raise TypeError('write_concern must be an instance of dict')
    --> 585         return self.__bulk.execute(write_concern)

    C:\Users\e306654\AppData\Local\Continuum\Anaconda\lib\site-packages\pymongo\bulk.pyc in execute(self, write_concern)
        429             self.execute_no_results(generator)
        430         elif client.max_wire_version > 1:
    --> 431             return self.execute_command(generator, write_concern)
        432         else:
        433             return self.execute_legacy(generator, write_concern)

    C:\Users\e306654\AppData\Local\Continuum\Anaconda\lib\site-packages\pymongo\bulk.pyc in execute_command(self, generator, write_concern)
        296                 full_result['writeErrors'].sort(
        297                     key=lambda error: error['index'])
    --> 298             raise BulkWriteError(full_result)
        299         return full_result
        300

    BulkWriteError: batch op errors occurred

Upvotes: 28

Views: 61259

Answers (6)

Niharika

Reputation: 1

I was trying to insert two documents with the same "_id" (the other keys differed). Solution (both options are sketched below):

  1. Insert a different "_id" for each document, or
  2. remove the "_id" field entirely and MongoDB auto-generates one for you.
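
A minimal sketch of both options, assuming a PyMongo 3+ collection handle (the database and collection names here are placeholders):

from pymongo import MongoClient

collection = MongoClient()["test_db"]["packets"]  # hypothetical names

# Option 1: assign a distinct _id to each document.
collection.insert_many([{"_id": 1, "v": "a"}, {"_id": 2, "v": "b"}])

# Option 2: omit _id; MongoDB generates a unique ObjectId per document.
collection.insert_many([{"v": "a"}, {"v": "b"}])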

Upvotes: 0

bwl1289

Reputation: 2058

In addition to the above, check your unique indexes. If you're bulk inserting and have a unique index on a field that doesn't exist in your data, you will get this error.

For example, I had accidentally declared a unique index on name, and the data I was inserting had no key called name. After the first document is inserted, every subsequent insert throws this error, because each one is technically another document with a unique name of null.

Here's a part of my model definition where I'm declaring a unique index:

from pymongo import ASCENDING

self.conn[self.collection_name].create_index(
    [("name", ASCENDING)],
    unique=True,
)

And here are the details of the error being thrown:

{'writeErrors': [{'index': 1, 'code': 11000, 'keyPattern': {'name': 1},
'keyValue': {'name': None}, 'errmsg': 'E11000 duplicate key error collection:
troposphere.temp index: name_1 dup key: { name: null }'
...

More resources: MongoDB E11000 duplicate key error
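
To see whether such a unique index exists on your collection, PyMongo's index_information() lists every index together with its options; a quick check (the collection handle is a placeholder):

# Print every unique index declared on the collection.
for name, info in collection.index_information().items():
    if info.get("unique"):
        print(name, info["key"])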

Upvotes: 1

Lamak

Reputation: 101

Try using a debugger; it should give you the errmsg with the exact error and the op object it was trying to insert.
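
For instance, a sketch using the standard-library debugger to inspect the failure interactively (one possible approach, not the only one):

import pdb

try:
    output = bulk.execute()
except Exception:
    # Opens a post-mortem session where you can inspect the exception,
    # e.g. its .details dict with the errmsg and the offending op.
    pdb.post_mortem()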

Upvotes: -1

Samer Aamar

Reputation: 1408

It can be many reasons...
The best approach is to try/except the exception and check the errors:

from pymongo.errors import BulkWriteError
try:
    bulk.execute()
except BulkWriteError as bwe:
    print(bwe.details)
    #you can also take this component and do more analysis
    #werrors = bwe.details['writeErrors']
    raise
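
Each entry in bwe.details['writeErrors'] then tells you which operation failed and why, for example:

# Inside the except block above: walk the individual write errors.
for err in bwe.details.get("writeErrors", []):
    # 'index' is the position of the failing op in the batch,
    # 'code' the server error code (e.g. 11000 = duplicate key),
    # 'errmsg' the human-readable message.
    print(err["index"], err["code"], err["errmsg"])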

Upvotes: 33

Miguel Angel

Reputation: 439

You should check 2 things:

  1. Duplicates, if you are defining your own key.
  2. Custom types: in my case I was trying to pass a hash-type object that could not be converted into a valid ObjectId, which led me back to the first point and into a vicious circle (I solved it by converting myObject to a string).

Inserting the documents one by one will give you an idea of what is happening.
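
A minimal sketch of that one-by-one approach (collection and packets are placeholders for your own objects):

from pymongo.errors import PyMongoError

for i, packet in enumerate(packets):
    try:
        collection.insert_one(packet)  # PyMongo 3+ API
    except PyMongoError as exc:
        # The first failure pinpoints the document with the bad key or value.
        print("document %d failed: %s" % (i, exc))
        break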

Upvotes: 17

David Makovoz

Reputation: 1918

OK, the problem was that I was assigning _id explicitly, and it turns out the string was longer than the 12-byte limit. My bad.
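
For reference, an ObjectId must be built from exactly 12 bytes or a 24-character hex string; anything else raises InvalidId. A quick illustration (the hex value below is made up):

from bson import ObjectId
from bson.errors import InvalidId

ObjectId()                            # auto-generated, always valid
ObjectId("5f43a1e2c9d3b2a1e4f5a6b7")  # 24-character hex string: OK
try:
    ObjectId("this-string-is-too-long-for-an-objectid")
except InvalidId as exc:
    print(exc)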

Upvotes: 20
