Reputation: 692
I'm not sure if this is possible, but I'm assuming something like it is. I've got a couple of JSON files, each containing a series of JSON objects with unique IDs. I'd like to be able to use a Node application to read these objects and return all of the objects with an ID greater than a parameter provided.
E.g. /after/20 would return all of the JSON objects from the file whose IDs are greater than 20.
These requests will happen very frequently, from lots of different users, so I was wondering whether there is a way to store or cache these files so they don't need to be repeatedly loaded, making the process both quicker and more efficient.
Upvotes: 0
Views: 111
Reputation: 5796
Here is a solution that will keep the whole dataset in memory. I am assuming you have at least Node 0.5 (so that require() can load JSON files), that the two JSON files are named json1.json and json2.json, and I will use Express because of its routing.
var express = require('express'),
    http = require('http'),
    json1 = require('./json1.json'),
    json2 = require('./json2.json');

var app = express();

//Express config
app.configure(function(){
    app.set('port', 8080);
    //...
    app.use(express.bodyParser());
    app.use(express.methodOverride());
    //Right before the router, add a middleware to expose your objects.
    //Note: plain middleware takes (req, res, next); adding a fourth `err`
    //argument would make Express treat it as error-handling middleware
    //and skip it on normal requests.
    app.use(function(req, res, next){
        req.json1 = json1;
        req.json2 = json2;
        next();
    });
    app.use(app.router);
    //...
});
Finally, any route handler will have access to both JSON objects.
app.get('/after/:id', function(req, res){
    var objects = req.json1,
        //req.params.id is a string, so coerce it before comparing,
        //and use > since the question asks for IDs strictly greater.
        id = parseInt(req.params.id, 10),
        result = objects.filter(function(obj){ return obj.id > id; });
    res.json(200, result);
});
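The filtering step can be sanity-checked on its own, outside of Express. A minimal sketch (the sample objects and the after() helper below are made up for illustration, standing in for the contents of json1.json):

```javascript
// Hypothetical data standing in for the parsed json1.json.
var objects = [
    { id: 10 },
    { id: 20 },
    { id: 30 }
];

// Same filter as in the route handler: keep objects whose id is
// strictly greater than the route parameter, which arrives as a
// string and must be coerced to a number.
function after(objects, idParam) {
    var id = parseInt(idParam, 10);
    return objects.filter(function (obj) { return obj.id > id; });
}

console.log(after(objects, '20')); // [ { id: 30 } ]
```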
Upvotes: 1
Reputation: 52
As far as I know, if you use Node.js's require() function to load a module (or a JSON file, in your case), it is cached based on its resolved filename, so subsequent calls return the already-parsed object.
See the Node.js API docs on modules for details.
I am using this technique in a project with many small files and they all get cached after their first load.
Upvotes: 0