Ckappo

Reputation: 597

node.js insert json array object into mysql table

I have a CSV that I want to save into my MySQL table. My parser works well, and it also saves the created JSON array to my table. My problem is that it inserts row by row in the background and doesn't send a response.

My code looks like that:

var csv = require('csvtojson');

var file = './mytable.csv';
connection.connect(function (err) {});

csv({ delimiter: "," })
    .fromFile(file)
    .on('end_parsed', function (jsonArray) {
        for (var i = 0; i < jsonArray.length; i++) {
            var post = jsonArray[i];
            connection.query('INSERT INTO mytable SET ?', post, function (err, result) {
                if (err) throw err;
                console.log(result.insertId);
            });
        }
        res.end("done");
    })
    .on('done', function (error) {
        console.log('end');
    });

My goal is that my API sends "done" (via res.json("done")) once the complete query has finished. What should I change?

Greetings

edit: my CSV is really large, almost 500k rows!

EDIT:

I added async to my parser like this:

csv({ delimiter: ";" })
    .fromFile(file)
    .on('end_parsed', function (jsonArray) {
        async.forEach(jsonArray, function (row, callback) {
            conn.query('INSERT INTO mytable SET ?', row, callback);
        }, function (err) {
            if (err) return next(err);
            res.json("done");
            console.log("done");
        });
    });

But it doesn't respond with "done" (in the terminal it logs it, but Postman only gives me "Could not get any response").

Upvotes: 0

Views: 1643

Answers (1)

Marc

Reputation: 1976

Your call to res.end() / res.json() doesn't wait for all inserts to finish.

And if you start your inserts within a for loop, you start them all more or less in parallel. You should take a look at something like the async library (http://caolan.github.io/async). There you'll find an eachLimit() function that lets you run async operations over a collection/array. With this function you can limit how many operations run in parallel, and you get a callback that is called when an error happens or when all async calls have finished. Within this callback you can call res.json(...) to send your response.

Sample:

var async = require('async');

//...

// Insert a single row; eachLimit passes the per-item callback.
function save_row_to_db(post, callback) {
    conn.query('INSERT INTO mytable SET ?', post, callback);
}

// Called once, either on the first error or after every insert has finished.
function finished(err) {
    if (err) throw err;
    res.end("done");
}

// Run at most 20 inserts in parallel.
async.eachLimit(csvRows, 20, save_row_to_db, finished);
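If you want to see what eachLimit() is doing under the hood, the same pattern can be sketched with plain Promises and no dependency. This is only an illustration: fakeInsert stands in for conn.query, and the row data is made up; in your app the .then() callback is where res.json("done") would go.

```javascript
// Minimal concurrency-limited "each": run at most `limit` workers at once.
function eachLimit(items, limit, worker) {
  let index = 0;
  // Each runner pulls the next item until the array is exhausted.
  async function runner() {
    while (index < items.length) {
      const item = items[index++];
      await worker(item);
    }
  }
  const runners = [];
  for (let i = 0; i < Math.min(limit, items.length); i++) {
    runners.push(runner());
  }
  // Resolves only after every item has been processed.
  return Promise.all(runners);
}

// Simulated insert: resolves after a short delay, like a DB round trip.
const inserted = [];
function fakeInsert(row) {
  return new Promise((resolve) =>
    setTimeout(() => { inserted.push(row); resolve(); }, 1)
  );
}

const rows = Array.from({ length: 100 }, (_, i) => ({ id: i }));

eachLimit(rows, 20, fakeInsert).then(() => {
  // Only respond once every insert has finished --
  // this is the point where res.json("done") belongs.
  console.log("done", inserted.length);
});
```

With 500k rows the limit matters: without it you would fire 500k queries at the MySQL connection at once, which is exactly why the for-loop version stalls.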

Upvotes: 1
