Callum Linington

Reputation: 14417

Node.js delivering files to the client

When you send an HTML file to the client, the browser parses it and issues GET requests for every referenced resource, such as <SCRIPT> and <LINK> tags (and <IMG> too, I believe).

Reading each of these files from disk on every request is very expensive.

Is there any way to cache the file contents in Node when the server starts up, so that each request can simply be served straight from memory? Something like the hypothetical sketch below.
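For example (scriptCache here is just a placeholder name for the in-memory dictionary I have in mind, not an existing API):

// Hypothetical: scriptCache was populated once at startup,
// keyed by filename, so no disk read happens per request.
function onRequest(request, response) {
    var cached = scriptCache[request.url.slice(1)];
    if (cached) {
        response.writeHead(200);
        response.end(cached);
    }
}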

I've got this function to go through all the scripts in the Scripts/ folder and read the files:

var fs = require('fs'),
    path = require('path');

var gatherFiles = function () {
    fs.readdir('./Scripts', function (err, files) {
        files.filter(function (file) {
            return path.extname(file) === '.js';
        }).forEach(function (file) {
            // read files here
        });
    });
};

Bear these points in mind:

I assume that, because the server is only started once, if all the files were cached in Node at startup they could be served from memory whenever needed. The files are scripts, so they aren't massive: 100 KB max, and probably much less after minification.

I ask because I've used the Chrome and Firefox network monitors to see how long the files take to serve, and it is a very long time, over 100 ms...

I'm guessing there is latency across many parts of the server: Ethernet, SD card, code.

So I think pre-caching this way should minimise that 100 ms lag.

My Solution

var fs = require("fs"),
    path = require("path"),
    async = require("async"),
    scriptFiles = {};

function gatherFiles(callback, folder) {
    fs.readdir(folder, function (err, files) {
        var filteredFiles = files.filter(function (file) {
            var ext = path.extname(file);
            return ext === '.js' || ext === '.html';
        });
        async.each(filteredFiles, function (file, asyncCallback) {
            // Skip files that are already cached, but still signal completion
            if (scriptFiles[file]) return asyncCallback();
            fs.readFile(path.join(folder, file), function (err, data) {
                scriptFiles[file] = data;
                asyncCallback(err);
            });
        }, function (err) {
            callback(scriptFiles);
        });
    });
}

exports.gatherFiles = gatherFiles;

The key points to note are: the files are read in parallel with async.each, the contents are cached in the scriptFiles dictionary keyed by filename, already-cached files are skipped, and the callback only fires once every file has been read.

Usage:

var http = require("http"),
    cacher = require("./cacher");

cacher.gatherFiles(function (fileDictionary) {
    // Only start accepting requests once everything is cached
    http.createServer(onRequest).listen(8888);
}, './Scripts');
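onRequest isn't shown above; a minimal sketch of what it could look like, assuming the request URL's basename is used as the dictionary key (that mapping is my assumption):

var path = require("path");

// Assumes fileDictionary is in scope, e.g. this function is
// defined inside the gatherFiles callback above.
function onRequest(request, response) {
    // e.g. GET /Scripts/app.js -> fileDictionary["app.js"]
    var file = fileDictionary[path.basename(request.url)];
    if (file) {
        response.writeHead(200);
        response.end(file);
    } else {
        response.writeHead(404);
        response.end();
    }
}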

Upvotes: 3

Views: 2195

Answers (1)

Robert

Reputation: 21388

I think something like this would do what you're looking for, which is to eliminate the disk read.

var fs = require("fs");

var Files = (function () {
    var files = {};

    return function (filepath, callback) {
        // If a cached copy is available, return that
        if (files[filepath]) return callback(null, files[filepath]);
        // Otherwise read it from disk, then cache it
        fs.readFile(filepath, function (err, data) {
            if (err) return callback(err);
            files[filepath] = data;
            callback(null, data);
        });
    };
})();

Then call it like this:

Files('/scripts/whatever.js', function (err, data) {
    // Do something
});

If you want, you can add other features such as file watching or cache expiration.
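For example, a rough sketch of invalidation with fs.watch, hooked in right after a file is first cached (fs.watch behavior varies by platform, so treat this as a starting point rather than a finished implementation):

// Inside the fs.readFile callback, after files[filepath] = data:
fs.watch(filepath, function () {
    // Drop the cached copy; the next request re-reads from disk
    delete files[filepath];
});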

If you want to cache everything at the beginning, you can just iterate through the directory at startup and pass no callback.

For that I'd modify it as such:

//...
if (files[filepath]) return callback && callback(null, files[filepath]); 
//...
callback && callback(null, data);
//...

The callback && callback(...) pattern behaves like if (false && alert("Nothing")): because && short-circuits, the right-hand side is never evaluated when the left-hand side is falsy. So if you don't pass a callback, nothing ever tries to execute one. Sort of a fail-safe.
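With that change, warming the cache at startup could look something like this (the ./scripts directory and the .js filter are just examples):

var fs = require("fs"),
    path = require("path");

// Read every .js file once at startup; no callback is passed
// because we only want the side effect of populating the cache.
fs.readdir('./scripts', function (err, names) {
    names.filter(function (name) {
        return path.extname(name) === '.js';
    }).forEach(function (name) {
        Files(path.join('./scripts', name));
    });
});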

Upvotes: 3
