MRK

Reputation: 573

how to handle large array dataset in mongo map reduce function

I have a map/reduce job that builds up an array, where the reduce function performs array concatenation. When the resulting document exceeds the maximum BSON size limit, the map/reduce fails with a "value too large to reduce" error.

Please let me know how to handle this situation. I want all of the documents in the array, but I cannot simply concatenate in the reduce function, since it fails with the above error once there are too many documents.
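For reference, a minimal sketch of the failing pattern (field names `groupKey` and `item` are assumptions, not from my actual schema). Each reduced value is a single BSON document, so once the concatenated array pushes it past the BSON size cap, `mapReduce` aborts:

```javascript
// Hypothetical map: emit each document's payload under its grouping key,
// wrapped in a single-element array so map and reduce outputs have the
// same shape (a mapReduce requirement).
var mapFn = function () {
  emit(this.groupKey, { items: [this.item] });
};

// Reduce: concatenate all the arrays for a key. The merged array grows
// without bound, so a key with enough documents exceeds the max BSON
// document size and triggers "value too large to reduce".
var reduceFn = function (key, values) {
  var merged = { items: [] };
  values.forEach(function (v) {
    merged.items = merged.items.concat(v.items);
  });
  return merged;
};
```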

Thanks MRK

Upvotes: 1

Views: 329

Answers (1)

Macdiesel

Reputation: 935

You could use something like the MongoDB Hadoop Adapter to run your map reduce job on a real map/reduce framework.

https://github.com/mongodb/mongo-hadoop/

Upvotes: 3

Related Questions