Seb

Reputation: 1113

Sequelize / mysql using 100% CPU when create or update data

I need to update or create rows in a MySQL table from a large array (a few thousand objects) with Sequelize.

When I run the following code it uses up almost all the CPU power of my DB server (vserver, 2 GB RAM / 2 CPUs) and clogs my app for a few minutes until it's done.

Is there a better way to do this with Sequelize? Can it be done in the background somehow, or as a bulk operation, so it doesn't affect my app's performance?

data.forEach(function(item) {           

    var query = {
        'itemId': item.id,
        'networkId': item.networkId
    };

    db.model.findOne({
        where: query
    }).then(function(storedItem) {
        try {
            if(!!storedItem) {
                storedItem.update(item);
            } else  {
                db.model.create(item);
            }
        } catch(e) {
            console.log(e);
        }
    });

});

Upvotes: 1

Views: 1412

Answers (2)

Seb

Reputation: 1113

Weeks later I found the actual reason for this (and, unfortunately, using async didn't really help after all). It was as simple as it was stupid: I didn't have a MySQL index on itemId, so every iteration scanned the whole table, which (obviously) caused the high CPU load.
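For reference, the fix boils down to a composite index covering both columns in the lookup's WHERE clause (the table name here is assumed from the model, adjust to your schema):

```sql
-- Composite index on the two columns used by the findOne query,
-- so each lookup is an index seek instead of a full table scan
ALTER TABLE items ADD INDEX idx_item_network (itemId, networkId);
```

With Sequelize the same index can also be declared in the model definition via the `indexes` option, or added in a migration with `queryInterface.addIndex`.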

Upvotes: 1

O. Jones

Reputation: 108736

The first line of your sample code, data.forEach(...), makes a whole mess of calls to your function(item) {}, and the code in that function in turn fires off a whole mess of asynchronously completing operations, all at once.

Try using the async package (https://caolan.github.io/async/docs.html) and doing this:

var async = require('async');
...
async.mapSeries(data, function(item, callback) {
    // look up / update / create one item, then call callback(err, result)
    ...
});

It should let each invocation of your function (one per item in your data array) complete before the next one starts. Paradoxically enough, doing them one at a time will probably make the whole job finish faster. It will certainly avoid soaking up your server's resources.
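The async package serializes the calls for you; the same one-at-a-time behaviour can also be sketched with a plain promise chain, no extra dependency. Here upsertItem is a hypothetical stand-in for the question's findOne/update-or-create logic, not a real Sequelize call:

```javascript
// Process items strictly one at a time: the next upsert does not start
// until the promise for the previous one has resolved. Results are
// collected in order, like async.mapSeries would do.
function processSeries(items, upsertItem) {
    return items.reduce(function(chain, item) {
        return chain.then(function(results) {
            return upsertItem(item).then(function(result) {
                results.push(result);
                return results;
            });
        });
    }, Promise.resolve([]));
}
```

Because each item waits for the previous one, the database only ever sees a single query in flight from this loop instead of thousands at once.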

Upvotes: 1
