turkgen turkgen

Reputation: 36

Node.js loop crashes immediately when inserting bulk data into Cassandra

I am trying to insert 1,000,000 records into Cassandra with Node.js, but the loop crashes a little while later. Each time, I cannot insert more than 10,000 records. Why does the loop crash? Can anybody help me?

Thanks.

My code looks like:

var helenus = require('helenus'),
    pool = new helenus.ConnectionPool({
      hosts    : ['localhost:9160'],
      keyspace : 'twissandra',
      user     : '',
      password : '',
      timeout  : 3000
    });

pool.on('error', function(err){
  console.error(err.name, err.message);
});

var i = 0;
pool.connect(function(err, keyspace){
  if (err) {
    throw(err);
  } else {
    while (i < 1000000) {
      i++;
      var str = "tkg" + i;
      var pass = "ktr" + i;
      pool.cql("INSERT INTO users (username, password) VALUES (?, ?)", [str, pass], function(err, results){
      });
    }
  }
});

console.log("end");

Upvotes: 0

Views: 494

Answers (2)

turkgen turkgen

Reputation: 36

Actually, there was no problem. I checked the number of records twice at different times and saw that the write operation continued up to the timeout value, which is set in the code. In summary, there is no crash in the code. Thank you, Julian H. Lam, for the reply.

But another question: how can I increase the write performance of Cassandra? What should I change in the cassandra.yaml file, or elsewhere?

Thank you.

Upvotes: 0

Julian H. Lam

Reputation: 26124

You're probably overloading the Cassandra queue by attempting to make a million requests all at once! Keep in mind that each request is asynchronous, so it is fired off even if the previous one has not completed.

Try using async.eachLimit to limit it to 50-100 requests at a time. The actual maximum concurrent capacity changes based on the backend process.
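To illustrate the idea, here is a minimal sketch of limit-N concurrency. The `eachLimit` function below is a simplified, hand-rolled stand-in for `async.eachLimit` (same call shape: items, limit, iterator, final callback), and `fakeInsert` is a stand-in for the question's `pool.cql` call so the snippet runs without a Cassandra server; with the real `async` module installed, the iterator body would call `pool.cql(..., cb)` instead.

```javascript
// Simplified stand-in for async.eachLimit: process `items` with at most
// `limit` callbacks in flight at once.
function eachLimit(items, limit, iterator, done) {
  var index = 0, inFlight = 0, finished = 0, failed = false;
  function launch() {
    // Top up to `limit` concurrent tasks.
    while (inFlight < limit && index < items.length) {
      inFlight++;
      iterator(items[index++], function (err) {
        inFlight--;
        finished++;
        if (failed) return;
        if (err) { failed = true; return done(err); }
        if (finished === items.length) return done(null);
        launch(); // a slot freed up; start the next task
      });
    }
  }
  if (items.length === 0) return done(null);
  launch();
}

// Simulated async insert (stand-in for pool.cql); records peak
// concurrency so the cap is visible.
var active = 0, peak = 0;
function fakeInsert(i, cb) {
  active++;
  if (active > peak) peak = active;
  setImmediate(function () { active--; cb(null); });
}

var items = [];
for (var i = 1; i <= 1000; i++) items.push(i);

eachLimit(items, 50, fakeInsert, function (err) {
  if (err) throw err;
  console.log('done, peak concurrency:', peak); // never exceeds 50
});
```

The key difference from the question's `while` loop is that a new insert is only started when a previous one calls back, so no more than 50 requests are ever outstanding at once.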

Upvotes: 2
