Reputation: 5131
Using the Node.js package color-namer, I'm converting hex codes into color names (e.g. "#FF0000" is turned into "red").
All my hex codes are stored in a MySQL database (table = hex_codes; column = color), which I'm querying from Node.js. Row by row I then retrieve the name for each hex code (column = color) and dump the result into another table (table = color_names).
My code is the following:
var mysql = require("mysql"),
    namer = require('color-namer'),
    connection = mysql.createConnection({
      multipleStatements: true,
      host     : '***',
      user     : '***',
      password : '***',
      database : '***'
    });

connection.connect();

connection.query(
  `SELECT color from hex_codes`,
  function(err, results, fields) {
    results.forEach(function(elem){
      var currentHex = elem['color'],
          currentColor = namer(currentHex);
      connection.query(
        `INSERT INTO color_names (hex, basic) VALUES (?,?);`,
        [currentHex, currentColor['basic'][0]['name']]
      );
    });
  }
);
Since JavaScript is an asynchronous language, how come my rows are not inserted one by one, but the whole result set is dumped only once the script has finished running?
Upvotes: 0
Views: 79
Reputation: 9549
The operation in your code sample is expected to be sequential. Here is how:
results.forEach(function(elem){
  var currentHex = elem['color'],
      currentColor = namer(currentHex);
  connection.query(
    `INSERT INTO color_names (hex, basic) VALUES (?,?);`,
    [currentHex, currentColor['basic'][0]['name']]
  );
});
Here, the forEach statement has the job of delegating each query to the connection object, but it does not wait for the query to complete. All good.
However, the connection.query method by default supports only sequential execution, which means the subsequent insert requests in your loop are executed in sequence, not in parallel. The mysqljs documentation says:
The MySQL protocol is sequential, this means that you need multiple connections to execute queries in parallel.
For them to be executed in parallel, you should use connection pooling, which is described here.
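For illustration, here is a minimal sketch of what that could look like with mysqljs's createPool, reusing the placeholder credentials from the question (the connectionLimit value is an assumption):
var mysql = require("mysql"),
    namer = require('color-namer'),
    pool = mysql.createPool({
      connectionLimit : 10,   // assumed limit; up to 10 statements can run at the same time
      host            : '***',
      user            : '***',
      password        : '***',
      database        : '***'
    });

pool.query(`SELECT color from hex_codes`, function(err, results) {
  if (err) throw err;
  results.forEach(function(elem) {
    var currentHex = elem['color'],
        currentColor = namer(currentHex);
    // each pool.query call checks out a free connection, runs the statement,
    // and releases the connection back to the pool when it finishes
    pool.query(
      `INSERT INTO color_names (hex, basic) VALUES (?,?);`,
      [currentHex, currentColor['basic'][0]['name']],
      function(insertErr) {
        if (insertErr) console.error(insertErr);
      }
    );
  });
});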
Upvotes: 0
Reputation: 4876
The callback of the first connection.query is enough to prove it's asynchronous, and callbacks are only executed when the call stack is empty. So after getting the results you are doing the inserts one by one, but without attaching a callback to the second query you will never see whether it is asynchronous or synchronous, because it finishes quickly. Try:
connection.query(
  `SELECT color from hex_codes`,
  function(err, results, fields) {
    results.forEach(function(elem){
      var currentHex = elem['color'],
          currentColor = namer(currentHex);
      console.log("in loop");
      connection.query(
        `INSERT INTO color_names (hex, basic) VALUES (?,?);`,
        [currentHex, currentColor['basic'][0]['name']],
        function(err, data){
          console.log("insert callback");
        });
    });
  });
The order of the output will show by itself that it's asynchronous. And it's important to remember that callbacks are only executed when the call stack is empty.
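Assuming, say, three rows in hex_codes, the output would look something like the following: every synchronous loop iteration logs first, and the insert callbacks only fire afterwards, once the stack is empty and the queries have completed.
in loop
in loop
in loop
insert callback
insert callback
insert callback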
Upvotes: 1
Reputation: 956
It will dump rows one by one only if you use streams. Streams start emitting output as soon as the first item arrives; they are an extension of the event-emitter pattern.
However, if you use a callback, your code will still be asynchronous, but the whole result will arrive at once.
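As a rough sketch, mysqljs exposes such a streaming interface when you call query without a callback and listen for its events (the table and column names are just the ones from the question):
var query = connection.query(`SELECT color from hex_codes`);

query
  .on('error', function(err) {
    // handle errors from the SELECT
    console.error(err);
  })
  .on('result', function(row) {
    // fires once per row, as soon as that row arrives
    console.log(row.color);
  })
  .on('end', function() {
    // all rows have been received
    connection.end();
  });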
Hope this helps
Upvotes: 0