Mukesh Jha

Reputation: 944

How to use the async readdir function inside a loop iterating over a series of directories?

I am archiving text files from multiple directories. I iterate over a parent folder, which gives me a folderPath; each folderPath can contain many files (txt, pdf, etc.), and there are multiple folder paths. For each folderPath I call the async readdir, append the individual files to the archiver, and then finally close it. If I call archive.finalize() inside the loop (line 1), the zip doesn't contain the required number of txt files, only the initial ones, which is expected. If I keep archive.finalize() at line 2 instead, it throws the error shown below the directory structure. Can someone please help with this?

Directory structure is like:

mainfolder/folder1
mainfolder/folder2

mainfolder/folder1/sometext.txt
mainfolder/folder1/someanothertext.txt

mainfolder/folder2/sometext.txt
mainfolder/folder2/someanothertext.txt

Now I want to zip it as:

Output.zip, which contains folder1 and folder2 with their respective txt files. I was able to achieve this with the sync function readdirSync, but with the async version I am facing a callback issue.
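A simplified sketch of the readdirSync approach that worked (same basePath and children as in the code below; the extension filter is omitted):

this.children.forEach((value) => {
    const folderPath = path.join(basePath, value);
    // synchronous read: all file names are known before the loop moves on
    const fileNames = fs.readdirSync(folderPath);
    fileNames.forEach(file => {
        archive.file(path.join(folderPath, file), { name: `${value}/${file}` });
    });
});
archive.finalize(); // everything is already queued, so finalizing here is safe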

Error :

ArchiverError: queue closed
    at Archiver.file (C:\Users\Workspace\code\node_modules\archiver\lib\core.js:692:24)
    at C:\Users\Workspace\current_code_snippet
    at Array.forEach (<anonymous>)
    at C:\Users\current_code_snippet_ line 3 as specified in code snippet
    at FSReqCallback.oncomplete (fs.js:156:23) {
  code: 'QUEUECLOSED',
  data: undefined
}

Code :

this.children.forEach((value, _index, _array) => {
    const folderPath = path.join(basePath, value);
    fs.readdir(folderPath, (err, fileNames) => {
          if (err){
              throw err;
          }
          else {
            fileNames.forEach(file => {  // line 3
              if (outputType === ".txt") {
                  const filePath = path.join(basePath, value, file);
                  archive.file(filePath, { name: `${value}/${file}` }); // file is saved as value/file inside parent zip
              }
            })
          }
    })
    archive.finalize(); // line 1
});
archive.finalize(); // line 2

Upvotes: 0

Views: 459

Answers (1)

I would wrap the fs call in a Promise. This makes it possible to just await the operations and takes some complexity out of the code.

Be aware that forEach loops don't work with async/await.
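For example (using the readFiles helper defined below), the following would not wait:

children.forEach(async child => {
    // the await only pauses this callback; forEach itself returns immediately
    const files = await readFiles(child);
    console.log(child, files);
});
// execution reaches this point before any directory has been read,
// so finalizing the archive here would again close the queue too early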

const children = ["path_0", "path_1", "path_2", "path_3"];

// mock fs / async action
const fs = {
    readdir: (path, error, success) => {
        setTimeout(() => {
            success(["file_0", "file_1", "file_2"]);
        }, Math.random() * 1000);
    }
}

// function I would add to make the process simpler
const readFiles = (path) => {
    return new Promise(res => {
        fs.readdir(path, () => {}, (s) => {
            res(s);
        });
    });
}
// start helper as async
const start = async() => {
    // first await all files to be added to the archive
    await Promise.all(children.map(async child => {
        const files = await readFiles(child);
        // loop to add all files to the zip
        // archive.file(filePath, { name: `${value}/${file}` });
        console.log(child, files);
    }));
    // then archive all files
    // archive.finalize();
    console.log("finalize archive");
}
start();
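
Applied back to the question, a minimal sketch with the real fs.promises.readdir in place of the mock (assuming archive, basePath, children and outputType exist as in the question; filtering by file extension is an assumption about what the original outputType === ".txt" check intended):

const fsp = require("fs").promises;
const path = require("path");

const archiveFolders = async () => {
    // queue every file from every folder before finalizing
    await Promise.all(children.map(async value => {
        const folderPath = path.join(basePath, value);
        const fileNames = await fsp.readdir(folderPath);
        fileNames.forEach(file => {
            // assumption: only add files whose extension matches outputType
            if (path.extname(file) === outputType) {
                archive.file(path.join(folderPath, file), { name: `${value}/${file}` });
            }
        });
    }));
    archive.finalize(); // called exactly once, after all archive.file calls are queued
};

archiveFolders();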

Upvotes: 1
