Sundar

Reputation: 387

MongoDB MapReduce: Not working as expected for more than 1000 records

I wrote a mapreduce function where the records are emitted in the following format

{userid:<xyz>, {event:adduser, count:1}}
{userid:<xyz>, {event:login, count:1}}
{userid:<xyz>, {event:login, count:1}}
{userid:<abc>, {event:adduser, count:1}}

where userid is the key and the rest is the value for that key. After the MapReduce function, I want to get the result in the following format

{userid:<xyz>,{events: [{adduser:1},{login:2}], allEventCount:3}}

To achieve this I wrote the following reduce function. I know this can be done with a group-by, both in the aggregation framework and in map-reduce, but we need similar functionality for a more complex scenario, so I am taking this approach.

var reducefn = function(key, values){
    var result = {allEventCount: 0, events: []};
    values.forEach(function(value){
        var notfound = true;
        for(var n = 0; n < result.events.length; n++){
            var eventObj = result.events[n];
            for(var ev in eventObj){
                if(ev == value.event){
                    result.events[n][ev] += value.allEventCount;
                    notfound = false;
                    break;
                }
            }
        }
        if(notfound){
            var newEvent = {};
            newEvent[value.event] = 1;
            result.events.push(newEvent);
        }
        result.allEventCount += value.allEventCount;
    });
    return result;
};

This runs perfectly when I run it on 1000 records, but when there are 3k or 10k records, the result I get looks like this:

{ "_id" : {...}, "value" :{"allEventCount" :30, "events" :[ { "undefined" : 1},
{"adduser" : 1 }, {"remove" : 3 }, {"training" : 1 }, {"adminlogin" : 1 }, 
{"downgrade" : 2 } ]} }

I can't understand where this undefined came from, and the sum of the individual event counts is also less than allEventCount. All the docs in the collection have a non-empty event field, so there should be no chance of undefined.

MongoDB version: 2.2.1. Environment: local machine, no sharding.

In the reduce function, why does the operation result.events[n][ev] += value.allEventCount; fail when the similar operation result.allEventCount += value.allEventCount; succeeds?

The corrected answer, as suggested by JohnnyHK

Reduce function:

var reducefn = function(key, values){
    var result = {totEvents: 0, event: []};
    values.forEach(function(value){
        value.event.forEach(function(eventElem){
            var notfound = true;
            for(var n = 0; n < result.event.length; n++){
                var eventObj = result.event[n];
                for(var ev in eventObj){
                    for(var evv in eventElem){
                        if(ev == evv){
                            result.event[n][ev] += eventElem[evv];
                            notfound = false;
                            break;
                        }
                    }
                }
            }
            if(notfound){
                result.event.push(eventElem);
            }
        });
        result.totEvents += value.totEvents;
    });
    return result;
};
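To check that the shape now survives re-reduction, the output of this reduce can be fed back in as one of the values; plain JavaScript, runnable outside mongo (the reduce is repeated so the snippet is self-contained):

```javascript
// The corrected reduce function, repeated so this check is self-contained.
var reducefn = function(key, values){
    var result = {totEvents: 0, event: []};
    values.forEach(function(value){
        value.event.forEach(function(eventElem){
            var notfound = true;
            for(var n = 0; n < result.event.length; n++){
                var eventObj = result.event[n];
                for(var ev in eventObj){
                    for(var evv in eventElem){
                        if(ev == evv){
                            result.event[n][ev] += eventElem[evv];
                            notfound = false;
                            break;
                        }
                    }
                }
            }
            if(notfound){
                result.event.push(eventElem);
            }
        });
        result.totEvents += value.totEvents;
    });
    return result;
};

// Values shaped exactly like the updated map output.
var partial = reducefn("xyz", [
    {totEvents: 1, event: [{login: 1}]},
    {totEvents: 1, event: [{login: 1}]}
]);
// partial: {totEvents: 2, event: [{login: 2}]}

// Feeding the partial result back in together with a fresh value still
// gives consistent counts, because the shapes match.
var total = reducefn("xyz", [partial, {totEvents: 1, event: [{adduser: 1}]}]);
// total: {totEvents: 3, event: [{login: 2}, {adduser: 1}]}
```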

Upvotes: 3

Views: 847

Answers (1)

JohnnyHK

Reputation: 311865

The shape of the object you emit from your map function must be the same as the object returned from your reduce function, as the results of a reduce can get fed back into reduce when processing large numbers of docs (like in this case).
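You can reproduce the symptom outside mongo by calling your reduce with one of its own results in the values array (a minimal sketch; the map values here carry allEventCount: 1 to match the += lines in your reduce):

```javascript
// The original reduce function from the question.
var reducefn = function(key, values){
    var result = {allEventCount: 0, events: []};
    values.forEach(function(value){
        var notfound = true;
        for(var n = 0; n < result.events.length; n++){
            var eventObj = result.events[n];
            for(var ev in eventObj){
                if(ev == value.event){
                    result.events[n][ev] += value.allEventCount;
                    notfound = false;
                    break;
                }
            }
        }
        if(notfound){
            var newEvent = {};
            newEvent[value.event] = 1;
            result.events.push(newEvent);
        }
        result.allEventCount += value.allEventCount;
    });
    return result;
};

// First pass: values in the shape the map emits.
var partial = reducefn("xyz", [
    {event: "login", allEventCount: 1},
    {event: "login", allEventCount: 1}
]);
// partial: {allEventCount: 2, events: [{login: 2}]}

// Second pass: mongo hands the partial result back as one of the values.
var rereduced = reducefn("xyz", [partial, {event: "adduser", allEventCount: 1}]);
// rereduced: {allEventCount: 3, events: [{undefined: 1}, {adduser: 1}]}
// partial.event is undefined, so a bogus "undefined" key is created and the
// {login: 2} entry is dropped: the events now sum to less than allEventCount.
```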

So you need to change your emit to emit docs like this:

{userid:<xyz>, {events:[{adduser: 1}], allEventCount:1}}
{userid:<xyz>, {events:[{login: 1}], allEventCount:1}}

and then update your reduce function accordingly.
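A map function emitting that shape might look like this (a sketch; the userid and event field names are taken from your documents, and emit is stubbed so the snippet runs outside mongo):

```javascript
// Stub for mongo's emit() so the sketch is runnable stand-alone.
var emitted = [];
function emit(key, value) { emitted.push({key: key, value: value}); }

// Map function emitting the same shape the reduce returns:
// an events array plus an allEventCount total.
var mapfn = function () {
    var ev = {};
    ev[this.event] = 1;
    emit(this.userid, {events: [ev], allEventCount: 1});
};

// Simulate mongo invoking map with a document as `this`.
mapfn.call({userid: "xyz", event: "login"});
// emitted[0].value: {events: [{login: 1}], allEventCount: 1}
```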

Upvotes: 2
