Reputation: 40062
We have an API (ASP.NET Web API 2, to be specific) which needs to traverse a very large collection of entities in some algorithmic fashion in order to compute the desired result.
This collection is the same for every query performed. It forms the data "source" for every call.
Properties of the collection: it is very large, identical for every query, and effectively read-only.
Reading this collection from disk or a database on every API request adds a lot of overhead. So we keep it in a static field, effectively as a singleton. This seems to work: static objects are shared between all requests to the application and have the same lifetime as the application domain, which makes API responses almost instant.
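For illustration, a minimal sketch of that pattern (the Entity type and the loader method are hypothetical stand-ins for our real ones); Lazy&lt;T&gt; makes the one-time load thread-safe even under concurrent first requests:

```csharp
using System;
using System.Collections.Generic;

// Hypothetical entity type standing in for the real domain object.
public class Entity
{
    public int Id { get; set; }
}

public static class EntityCache
{
    // Lazy<T> ensures the expensive load runs exactly once, even when
    // several concurrent requests hit the cache for the first time.
    private static readonly Lazy<IReadOnlyList<Entity>> _entities =
        new Lazy<IReadOnlyList<Entity>>(LoadEntitiesFromStore);

    public static IReadOnlyList<Entity> Entities
    {
        get { return _entities.Value; }
    }

    private static IReadOnlyList<Entity> LoadEntitiesFromStore()
    {
        // Placeholder for the real one-time read from disk or database.
        return new List<Entity> { new Entity { Id = 1 } };
    }
}
```

Controllers then read EntityCache.Entities on every request without touching disk or the database again.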
Is there perhaps a better pattern, practice or framework for such a problem?
Upvotes: 1
Views: 751
Reputation: 39045
You can use cache servers like Redis or Memcached.
This SO question compares the two, explains what they are all about, and shows how they differ from your current implementation: Memcached vs. Redis?
And, of course, the official websites for each are worth a look.
There is even a third competitor: Hazelcast.
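To give an idea of what moving the data out of process looks like, here is a minimal sketch using the StackExchange.Redis client (the address, key name, and serialized payload are assumptions, not part of the question):

```csharp
using System;
using StackExchange.Redis;

class RedisCacheExample
{
    static void Main()
    {
        // "localhost:6379" is an assumed address for a local Redis instance.
        ConnectionMultiplexer redis = ConnectionMultiplexer.Connect("localhost:6379");
        IDatabase db = redis.GetDatabase();

        // Store the serialized collection once (e.g., at application startup)...
        db.StringSet("entities", "[{\"Id\":1}]");

        // ...then any web server sharing the cache can read it back.
        string cached = db.StringGet("entities");
        Console.WriteLine(cached);
    }
}
```

The trade-off versus your static object is a network hop and (de)serialization per read, in exchange for a single shared copy that every server sees.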
Upvotes: 1
Reputation: 3048
If you really never need to refresh your data, your solution is fine.
But only until you have to scale out to more than one web server. Once you reach that point, you will probably need to make sure that all the web servers serve the same data, and, depending on your environment and architecture, that can be tricky. If you ever get there, you will probably be back on SO with another question...
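One way to keep the in-process cache refreshable without blocking readers, sketched under the assumption that the shared source exposes a version number (LoadSnapshot and the polling trigger are hypothetical): publish the data as an immutable snapshot and swap it atomically.

```csharp
using System;
using System.Collections.Generic;
using System.Threading;

public static class RefreshableCache
{
    private static IReadOnlyList<string> _snapshot = LoadSnapshot();
    private static long _version;

    // Readers always see a complete, consistent snapshot.
    public static IReadOnlyList<string> Current
    {
        get { return Volatile.Read(ref _snapshot); }
    }

    // Each server polls the shared source's version and swaps in a fresh
    // snapshot when it changes; in-flight requests keep the old snapshot.
    public static void RefreshIfStale(long sourceVersion)
    {
        if (Interlocked.Read(ref _version) == sourceVersion) return;
        IReadOnlyList<string> fresh = LoadSnapshot();
        Volatile.Write(ref _snapshot, fresh);
        Interlocked.Exchange(ref _version, sourceVersion);
    }

    private static IReadOnlyList<string> LoadSnapshot()
    {
        // Placeholder for the real read from the shared data source.
        return new List<string> { "entity-1" };
    }
}
```

Even so, servers polling independently will briefly disagree after a refresh; if that matters, you are back to a shared cache server like the ones suggested above.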
Upvotes: 1