My goal is to write the Fields from the JSON object below into a MySQL database.
var dataRow = {
    Table: "promosbudget",
    Fields: [{
        ManagerID: "Jose",
        UserID: "ife",
        Budget: "50000",
        Year: "2015"
    },
    {
        ManagerID: "Jose",
        UserID: "fgs",
        Budget: "50000",
        Year: "2015"
    },
    {
        ManagerID: "Jose",
        UserID: "brz",
        Budget: "50000",
        Year: "2015"
    }]
};
I'm using this command to receive and write the data:
app.post('/paramsjson', jsonParser, function (req, res) {
    conMySQL.query('INSERT INTO ' + req.body.Table + ' SET ?', req.body.Fields,
        function (err, result) {
            console.log(result);
        }
    );
});
The issue is that only the first row gets written; the other two rows are omitted.
I'd like to know the recommended way to do this correctly when I need to export a large JSON object (100,000 rows). Is it necessary to loop over the rows and insert them one at a time?
Thanks in advance for your help!
Try to go step by step. Don't INSERT into the database right away: first console.log the output, inspect the results, and try the INSERT directly from your code; then combine it all. As for why only the first row is written: the mysql driver's "SET ?" form binds a single object to the placeholder, so when you pass an array of objects only its first element is used.
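If you want to stay with the plain mysql driver, its multi-row form is "INSERT INTO tbl (cols) VALUES ?" with a nested array of values rather than an array of objects. A minimal sketch, assuming the four columns from your sample data:

var rows = req.body.Fields.map(function (f) {
    return [f.ManagerID, f.UserID, f.Budget, f.Year];
});
// ?? escapes the table name; VALUES ? expands the nested array
// into a single multi-row INSERT.
conMySQL.query(
    'INSERT INTO ?? (ManagerID, UserID, Budget, Year) VALUES ?',
    [req.body.Table, rows],
    function (err, result) {
        if (err) console.log(err);
        else console.log(result.affectedRows + ' rows inserted');
    }
);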
Anyway, I highly recommend using knex for any DB operation.
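If knex is new to you, a minimal setup for MySQL looks roughly like this (all connection values below are placeholders):

// Minimal knex initialization; replace the connection
// values with your own.
var knex = require('knex')({
    client: 'mysql',
    connection: {
        host: 'localhost',
        user: 'dbuser',
        password: 'dbpassword',
        database: 'promosdb'
    }
});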
Take this sample code for testing:
app.post('/paramsjson', jsonParser, function (req, res) {
    console.log(req.body);

    var table = req.body.Table;
    var fields = req.body.Fields;

    // If you want to use knex: insert() accepts an array of row
    // objects and writes them all in one multi-row INSERT.
    knex(table).insert(fields)
        .then(function (result) {
            console.log(result);
            res.send(result);
        })
        .catch(function (err) {
            console.log(err);
            res.status(500).send(err);
        });
});
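For the 100,000-row case you don't need a sequential loop. A single insert with an array works up to a point, but for very large arrays knex also offers batchInsert, which chunks the rows for you. A sketch, assuming a knex version that ships batchInsert (the chunk size of 1000 is just an example value):

// batchInsert splits the rows into chunks of 1000 per query
// and runs all the inserts inside one transaction.
knex.batchInsert(table, fields, 1000)
    .then(function () {
        console.log('Inserted ' + fields.length + ' rows');
    })
    .catch(function (err) {
        console.log(err);
    });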