Reputation: 10156
In the past I asked a few questions about how to create my own API to let external sites access the data I store in my database. This question is asking for suggestions on how to optimize that API access so the load on my server stays down. First, the way the API is set up: MySQL holds the data and PHP serves it. The API is called via a URL with an API key in it. For example:
http://mysite.com/get_listings.php?key=somekey
My script then performs the requested action and returns the results in JSON format. Currently the API serves 10-15k unique calls a month, and I am adding another group of sites that will bring in another 20-30k unique calls a month on top of that. There is no caching going on; it's a straight call to my API via the PHP script, and it queries the MySQL database on every call. The data returned is basically stories and profile information, and the calls are things like latest stories, the profile of a story's author, and story details.
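For reference, the call above maps to roughly this kind of endpoint. This is a minimal sketch, not my actual script: the table, column names, and PDO credentials are hypothetical placeholders.

```php
<?php
// get_listings.php -- minimal sketch of the endpoint described above.
// Table/column names and credentials are hypothetical placeholders.

// Turn a result set into the JSON payload the API returns.
function respond_json(array $rows): string {
    return json_encode(['status' => 'ok', 'results' => $rows]);
}

// In the real script the rows come from MySQL, e.g.:
// $pdo  = new PDO('mysql:host=localhost;dbname=stories', 'user', 'pass');
// $stmt = $pdo->prepare('SELECT id, title FROM stories ORDER BY id DESC LIMIT 10');
// $stmt->execute();
// $rows = $stmt->fetchAll(PDO::FETCH_ASSOC);
//
// header('Content-Type: application/json');
// echo respond_json($rows);
```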
Besides the normal optimization of the PHP code and of the MySQL queries and indexes, can you suggest any other optimization techniques that would keep the server load down while still serving the API to the external sites quickly? I understand the external sites should be doing some caching on their end as well. But on my end, do you suggest caching the results from the database and then serving the cache? If so, would memcache be a good choice? Or perhaps storing the JSON results in Redis? Or would a simple file caching system work? Anything other than caching? Obviously server hardware can make a difference, but that is out of the scope of this post. I will say, though, that I will be getting a whole new server dedicated solely to this.
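To make the simplest of the options above concrete, here is a minimal file-cache sketch. The TTL value is an arbitrary choice, and the callable stands in for the real MySQL query:

```php
<?php
// Minimal file cache: serve a stored JSON payload while it is fresh,
// otherwise rebuild it. The TTL and the $fetch_from_db callable
// (the real MySQL query) are hypothetical stand-ins.

function cached_json(string $key, int $ttl, callable $fetch_from_db): string {
    $path = sys_get_temp_dir() . '/api_cache_' . md5($key) . '.json';

    // Cache hit: the file exists and is younger than $ttl seconds.
    if (is_file($path) && time() - filemtime($path) < $ttl) {
        return file_get_contents($path);
    }

    // Cache miss (or stale): query the database and store the JSON.
    $json = json_encode($fetch_from_db());
    file_put_contents($path, $json, LOCK_EX);
    return $json;
}

// Usage, e.g. inside get_listings.php:
// echo cached_json('latest_stories', 300, fn() => query_latest_stories($pdo));
```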
I do have root access to install additional software, if that helps.
Upvotes: 2
Views: 1548
Reputation: 10664
Unless you have the capacity to scale MySQL with additional read-only nodes, you should really start using Redis or Memcached. Redis might be better at handling complex data types.
If you are using Redis for permanent storage, make sure that your PHP code knows how to clean it up (for example, by setting an expiration on every key). In my experience it turned out to be quite resilient, although I still keep the canonical data in MySQL.
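A common way to keep Redis cleaned up automatically is cache-aside with a TTL on every key, so stale entries expire on their own. Here is a sketch; the interface and in-memory class below are stand-ins I made up so the pattern is self-contained, and in production you would pass a phpredis `\Redis` instance, whose `get()`/`setEx()` methods have the same shape:

```php
<?php
// Cache-aside with a TTL: every key expires on its own, so nothing
// accumulates in Redis. The interface and ArrayCache are hypothetical
// stand-ins used only so the pattern can run without a server.

interface KeyValueCache {
    public function get(string $key);                        // string|false
    public function setEx(string $key, int $ttl, string $value): void;
}

// In-memory stand-in for illustration; a real cache honors the TTL.
class ArrayCache implements KeyValueCache {
    private array $data = [];
    public function get(string $key) { return $this->data[$key] ?? false; }
    public function setEx(string $key, int $ttl, string $value): void {
        $this->data[$key] = $value;                          // TTL ignored here
    }
}

// Look in the cache first; on a miss, run the MySQL query and store
// the JSON with a TTL so it expires without any explicit cleanup code.
function latest_stories_json(KeyValueCache $cache, callable $query_mysql): string {
    $key = 'api:latest_stories';
    if (($json = $cache->get($key)) !== false) {
        return $json;                                        // cache hit
    }
    $json = json_encode($query_mysql());
    $cache->setEx($key, 300, $json);                         // 5-minute TTL
    return $json;
}
```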
Redis is great if you understand the computational complexity of its commands: most are O(1) or O(log n), but some, like KEYS, are O(n) and can stall the server on large datasets.
Upvotes: 3