Reputation: 388
I have a PHP script that shows multiple tables of data loaded via jQuery AJAX calls for a tournament scoring website. The tables show individual and team scores calculated in real time.
Each table is checked once a minute for new scores and refreshed if needed. To do this, each table sends a quick AJAX request for the last-updated time of its scores; if that time differs from what the table currently has, the table is refreshed with new data.
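A rough sketch of the kind of last-updated endpoint being polled, just to illustrate the setup (the table, column, and parameter names here are assumptions, not the real schema):

<?php
// last_updated.php -- polled by each table once a minute (illustrative sketch only)
// NOTE: "scores", "updated_at", and "division_id" are assumed names, not the actual schema.
$division = (int) $_GET['division'];
$r = mysql_query("SELECT MAX(updated_at) AS last_update FROM scores WHERE division_id = $division");
$row = mysql_fetch_assoc($r);
header('Content-Type: application/json');
echo json_encode(array('last_update' => $row['last_update']));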
My problem is that these tournaments can get quite large, with many participants, teams, divisions, and scores, and loading each table uses server resources. This is no problem if a handful of people are viewing the real-time stats, but once more than about 50 people are connected to this stats page, the server starts lagging heavily.
I would like to know what I can do to optimize the data, the ajax calls, or the server itself to run as efficiently as possible so that the server doesn't top out so quickly. I've already combed through the MySQL calls and tried to optimize everything as much as possible, but it doesn't seem to be enough.
Any ideas would be greatly appreciated!
Upvotes: 1
Views: 1534
Reputation: 30087
You could use a cache; memcached, for example, would fit your architecture very well. Every time your server receives a request, your PHP app tries to get the result from the cache and only executes the query if nothing is found.
You can key the cache in different ways; this example takes the easy approach, where the cache key is based directly on the submitted SQL.
global $memcache;
$memcache = new Memcache;
$memcache->connect('localhost', 11211); // adjust host/port to your memcached server

// Gets a key / value pair from memcache ... called by mysql_query_cache()
function getCache($key) {
    global $memcache;
    return ($memcache) ? $memcache->get($key) : false;
}

// Puts a key / value pair into memcache ... called by mysql_query_cache()
function setCache($key, $object, $timeout = 60) {
    global $memcache;
    return ($memcache) ? $memcache->set($key, $object, MEMCACHE_COMPRESSED, $timeout) : false;
}

// Caching version of mysql_query()
function mysql_query_cache($sql, $linkIdentifier = false, $timeout = 60) {
    // Only hit the database when the result is not already cached
    if (($cache = getCache(md5("mysql_query" . $sql))) === false) {
        $r = ($linkIdentifier !== false) ? mysql_query($sql, $linkIdentifier) : mysql_query($sql);
        if (is_resource($r) && (($rows = mysql_num_rows($r)) !== 0)) {
            for ($i = 0; $i < $rows; $i++) {
                $fields = mysql_num_fields($r);
                $row = mysql_fetch_array($r);
                for ($j = 0; $j < $fields; $j++) {
                    if ($i === 0) {
                        $columns[$j] = mysql_field_name($r, $j);
                    }
                    $cache[$i][$columns[$j]] = $row[$j];
                }
            }
            if (!setCache(md5("mysql_query" . $sql), $cache, $timeout)) {
                // If we get here, there isn't a memcache daemon running or responding
            }
        }
    }
    return $cache;
}
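Assuming the wrapper above, a scores query would then only hit MySQL once per expiry window for each unique SQL string; the table and column names below are made up for illustration:

// Example usage: served from memcache for up to 60 seconds per unique SQL string
$teamTotals = mysql_query_cache(
    "SELECT team_id, SUM(points) AS total FROM scores GROUP BY team_id",
    false,
    60
);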
Upvotes: 2
Reputation: 2908
Cache the output, and refresh the cache every 1 or 5 minutes.
In other words, simply write the database output to a file, and have all requests within that interval use the file. On every request, check the file's age; if it is older than 1, 2, or 5 minutes (for example), delete the cache file and rebuild it.
This lets MySQL rest during that time. The AJAX calls themselves are fine as they are, as long as you feel your polling intervals are necessary.
Some sample PHP:
$cache_expires = 2; # Minutes
if (file_exists("stats.cache") && filemtime("stats.cache") > time() - $cache_expires * 60) {
    # Read cache
    $result = unserialize(file_get_contents("stats.cache"));
    # Display result
} else {
    # Query the database
    $result = get_stats_from_database();
    # Write cache
    file_put_contents("stats.cache", serialize($result));
    # Display result
}
Upvotes: 1