Reputation: 8424
Say I have:
var fs = require('fs');
fs.readFile(__dirname + '/someBigFile.txt', function (err, data) {
  console.log(data);
});
I've noticed that, if it first takes, say, 2 seconds to read someBigFile.txt, subsequent reads take only a few milliseconds. Is there some internal caching happening in Node.js when you read a file multiple times? For example:
for (var i = 0; i < 10000; i++) {
  fs.readFile(__dirname + '/greet.txt', 'utf-8', function (err, data) {
  });
}
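A minimal way to check this is to time two consecutive reads of the same file (the console.time labels here are illustrative):
var fs = require('fs');
var file = __dirname + '/someBigFile.txt';

console.time('first read');
fs.readFile(file, function (err, data) {
  console.timeEnd('first read');     // slow on a cold cache
  console.time('second read');
  fs.readFile(file, function (err, data) {
    console.timeEnd('second read');  // typically much faster
  });
});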
Upvotes: 7
Views: 6309
Reputation: 14809
The fs module does not cache what it has read.
But if you need a cached version and you happen to be reading JSON files, you can use require for that: it reads the file once and caches the parsed result.
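For example, assuming a config.json file sits next to the script (the file name is just an illustration):
var first = require('./config.json');   // reads and parses the file from disc
var second = require('./config.json');  // served from require.cache, no disc read
console.log(first === second);          // true: the same cached object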
Upvotes: 0
Reputation: 430
I ran into this problem in a different context: I wrote a Node.js server that reads a text file on a network drive that is essentially read-only to me, but another application occasionally writes to that file, and I'd like to see those changes immediately. My Node.js app kept reading a stale version of the file even though I could see that the file had changed.
I solved it after reading the docs and adding the 'rs' flag, which bypasses the local file system cache: https://nodejs.org/api/fs.html#fs_file_system_flags
fs.readFile(path, { flag: 'rs' }, (err, data) => { /* data reflects the current file on disc */ })
Upvotes: 4
Reputation: 8325
The simple answer is no: there is no internal caching in Node.js itself.
But you can wrap the fs module in a cached read-only version to speed things up if you're reading the same files and directories multiple times and nothing changes on disc.
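A minimal sketch of such a wrapper, assuming files never change while the process runs (the readFileCached name and the plain object cache are illustrative):
var fs = require('fs');
var cache = {};

function readFileCached(path, encoding, callback) {
  if (cache[path] !== undefined) {
    // Serve the cached copy, but stay asynchronous like fs.readFile.
    return process.nextTick(function () { callback(null, cache[path]); });
  }
  fs.readFile(path, encoding, function (err, data) {
    if (!err) cache[path] = data; // cache only successful reads
    callback(err, data);
  });
}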
Upvotes: 1
Reputation: 1029
No. It is most likely your storage system's cache (in some cases it can be an OS or additional software cache). Try writing some data to the file and then reading it again.
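A quick sketch of that test, reusing the example file from the question:
var fs = require('fs');
var file = __dirname + '/someBigFile.txt';

fs.appendFile(file, '\nnew data', function (err) {
  if (err) throw err;
  fs.readFile(file, 'utf-8', function (err, data) {
    if (err) throw err;
    // The appended text shows up immediately, so fs.readFile is not
    // serving a stale application-level copy; the speed-up comes from
    // a lower-level cache.
    console.log(data.slice(-8)); // 'new data'
  });
});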
Upvotes: 1