Reputation: 4855
This question is related to an answer to my previous question. There, @robertklep recommended that I use mapLimit() instead of .map(), because .map() can't handle a large series of data, and with that solution everything worked fine. But now I have restructured my code, and neither of the .<fn>Limit() functions runs past the first loop iteration. Am I missing something here?
var proccesBook = function(file, cb) {
testFile(file, function (epub) {
if (epub) {
getEpuData(file, function (data) {
insertBookInDB(data)
})
}else{
cb(file)
}
})
}
async.mapLimit(full_files_path, 10, proccesBook, function(err){
if(err){
console.log('Corrupted file', err);
} else {
console.log('Processing complete');
};
})
// ---> only runs for the first 10 files
Upvotes: 1
Views: 563
Reputation: 146014
Your primary issue is that you don't call cb in the success branch of proccesBook. Your control flow must guarantee to call the callback exactly once for each worker function invocation.
Other asides:
- Since you never use the results from the workers, eachLimit is fine; use mapLimit if you need the results of each worker.
- Don't call cb(file) to report a bad file, as the first callback argument is interpreted as an error and will abort the remaining processing.

var proccesBook = function(file, cb) {
testFile(file, function (epub) {
if (epub) {
getEpuData(file, function (data) {
insertBookInDB(data)
cb() // This is what you were missing
})
}else{
cb()
}
})
}
async.eachLimit(full_files_path, 10, proccesBook, function(err){
if(err){
console.log('Corrupted file', err);
} else {
console.log('Processing complete');
}
})
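If you do need a per-file result rather than just completion, mapLimit collects whatever each worker passes as the second argument to its callback. Here is a minimal sketch of that variant, assuming the same testFile, getEpuData and insertBookInDB helpers and the full_files_path array from the question (their callback shapes are inferred from the question's usage):

var async = require('async');

// Same worker as above, but it passes the parsed data back
// so mapLimit can collect it into the results array.
var proccesBook = function(file, cb) {
  testFile(file, function (epub) {
    if (epub) {
      getEpuData(file, function (data) {
        insertBookInDB(data)
        cb(null, data) // result for this file
      })
    } else {
      cb(null, null) // not an epub: no result, but not an error either
    }
  })
}

async.mapLimit(full_files_path, 10, proccesBook, function(err, results) {
  if (err) {
    console.log('Corrupted file', err);
  } else {
    console.log('Processing complete,', results.length, 'results');
  }
})

Either way, the rule above still applies: every code path inside the worker has to call cb exactly once, and only pass a truthy first argument when you actually want to treat it as an error.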
Upvotes: 3