aaron

Reputation: 2056

fs.readFileSync - Error: EMFILE, too many open files 'D:\Workspace\some.json'

I've searched here for a long time but haven't found an answer. I simply want to read 4000 JSON files in a loop and do something with them afterwards:

try {
  data = JSON.parse(fs.readFileSync(file));
} catch (err) {
  console.error(err);
  next();
  return;
}

This seems like such a simple problem, so why can't I find an answer? I tried graceful-fs and still got the same error.

Any suggestions? Thanks very much!

Upvotes: 3

Views: 2993

Answers (4)

aaron

Reputation: 2056

I gave up on readFileSync and used the asynchronous fs.readFile instead.

    fs.readFile(file, function read(err, data) {})

Upvotes: 2

Kisaragi Hiu

Reputation: 236

Sync functions are not covered by graceful-fs. EMFILE means the current process has run out of file descriptors, which is impossible to deal with in the middle of a sync call, so graceful-fs makes no difference there.

It's weird, though: readFileSync is supposed to open the file, read it, then close it. You have probably hit a file-descriptor leak in your version of Node. It has likely been fixed between 2015 and now (2022), but with no version information and no code for the actual looping part, it is difficult to tell.

Upvotes: 0

dabobert

Reputation: 939

I had this same problem while traversing folders and uploading files. I was only able to solve it by queueing the files and reading them from the queue. I eventually went with the async library.

Upvotes: 1

santosh_Nxtech

Reputation: 1

You can use the following option to avoid this problem. Make sure you have installed the filequeue package from npm.

    var Filequeue = require('filequeue');

    var fq = new Filequeue(200); // max number of files to open at once

    fq.readdir('/path/to/files/', function(err, files) {
      if (err) {
        throw err;
      }
      files.forEach(function(file) {
        fq.readFile('/path/to/files/' + file, function(err, data) {
          // do something besides crash
        });
      });
    });

Upvotes: 0
