Reputation: 2342
I created a DropDown autocomplete menu using jQuery. The main idea is that once a key is typed into the textbox (3 characters minimum), the function sends a GET request to a different page (say: search.php?q=iron man 3), and that page (search.php) builds 5 results using MySQL and PHP and shows them.
Seems OK up to here, but I have many users on my website (about 10,000 per day), and since every keystroke triggers a query, it creates a massive load on the server, which slows it down.
Is there a smarter way of doing a DropDown Autocomplete menu?
Thank you.
Upvotes: 1
Views: 723
Reputation: 109
My Atomic firewall was tagging my autocomplete script as a DDoS attack. I went inside the jQuery autocomplete script and changed it so it 1) only looks for 3 letters, then creates a list, and 2) only queries if the query has not already been sent. This may not be the best way, but it stopped all the firewall issues and kept my server from issuing high-load warnings, cutting the load by maybe 80%.
function request(term, success, failure) {
    if (!options.matchCase)
        term = term.toLowerCase();
    // retrieve the cached data
    var data = cache.load(term);
    if (data && data.length) {
        success(term, data);
    // if an AJAX url has been supplied, try loading the data now
    } else if ((typeof options.url == "string") && (options.url.length > 0)) {
        var extraParams = {
            timestamp: +new Date()
        };
        $.each(options.extraParams, function(key, param) {
            extraParams[key] = typeof param == "function" ? param() : param;
        });
        // added by gary - the AJAX call is made at 3 letters and only 1 call
        // is made, to avoid swamping the server with AJAX requests
        if (term.length > 3) { // compare as a number, not the string '3'
            stopLoading();
        }
        if (term.length == 3) { // added by gary
            var calls = $("#AUTOCOMPLETECALLS").val();
            if (calls == 0) { // bypass if a call was already made
                $.ajax({
                    // try to leverage ajaxQueue plugin to abort previous requests
                    mode: "abort",
                    // limit abortion to this input
                    port: "autocomplete" + input.name,
                    dataType: options.dataType,
                    url: options.url,
                    data: $.extend({
                        q: lastWord(term),
                        limit: options.max
                    }, extraParams),
                    success: function(data) {
                        var parsed = options.parse && options.parse(data) || parse(data);
                        cache.add(term, parsed);
                        success(term, parsed);
                        var numCalls = $("#AUTOCOMPLETECALLS").val();
                        numCalls++;
                        $("#AUTOCOMPLETECALLS").val(numCalls);
                        var idData = $(input).attr('id');
                        if (idData == 'txtText') {
                            var SplitData = data.split("|");
                            if (SplitData[1] == undefined) {
                                if (document.getElementById('donor_id') != null) {
                                    $("#donor_id").val('0');
                                    $("#NewRecepientWarning").show();
                                }
                            }
                        }
                    }
                });
            } // eof calls
        } else { // eof term.length
            // no match is selected; the user just keeps typing
            hideResultsNow();
            var idData = $(input).attr('id');
            if (idData == 'txtText') {
                if (document.getElementById('donor_id') != null) {
                    $("#donor_id").val('0');
                    $("#NewRecepientWarning").show();
                }
            }
        }
    } else {
        // if we have a failure, we need to empty the list -- this prevents
        // the [TAB] key from selecting the last successful match
        select.emptyList();
        failure(term);
    }
};
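The hidden AUTOCOMPLETECALLS field above is just a guard against repeat requests. The same "one call per term" idea can be sketched more directly in plain JavaScript; the function and variable names below are illustrative, not from the original script:

```javascript
// Remember which terms have already been requested, so each
// distinct 3-letter prefix triggers at most one server call.
var requested = {};

function shouldQuery(term) {
    if (term.length !== 3) return false; // only fire at exactly 3 chars
    if (requested[term]) return false;   // this prefix was already asked for
    requested[term] = true;
    return true;
}
```

Keeping the memo per term (rather than one global counter) has the side benefit that a user who clears the box and types a different prefix still gets suggestions.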
Upvotes: 0
Reputation: 833
I assume you have already normalized and streamlined the table you're querying. It would probably be best to just query a table that has a name field, some score (e.g. popularity) for ranking results (if applicable), probably a movie_id to link you to actual movie data or whatever you're querying, and maybe an identifier for what type of data (actor, movie, etc.).
You don't have to normalize everything to the point of moving this data into a separate table and running a join every time you want to see the name; this can just be an extra, denormalized copy of the data. You can even use a storage engine you wouldn't normally use, like MEMORY or MyISAM (see text search below), to improve performance.
Next, add some conditions on when the query runs. Don't run the query until the user has stopped typing for a brief time (0.5 seconds or so, versus the default of 0.3 seconds). This is your easiest and quickest way to improve server stability, at the expense of responsiveness, since users have to pause longer. You could also try to prevent a client from running multiple simultaneous queries, which is left as an exercise for the reader, although with a proper delay this may be unnecessary. The relevant jQuery autocomplete documentation is at http://docs.jquery.com/UI/API/1.8/Autocomplete.
$( ".selector" ).autocomplete({
minLength: 3, // already set
delay: 500
});
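The delay-before-querying advice is essentially debouncing, which the widget does for you; if you are wiring it up by hand, a minimal sketch in plain JavaScript looks like this (the jQuery hook in the comment, and the `#search` selector, are assumptions):

```javascript
// debounce(fn, delayMs): returns a wrapper that postpones fn until
// delayMs of silence has passed; rapid repeat calls reset the timer,
// so a burst of keystrokes produces a single invocation.
function debounce(fn, delayMs) {
    var timer = null;
    return function () {
        var args = arguments;
        clearTimeout(timer);
        timer = setTimeout(function () {
            fn.apply(null, args);
        }, delayMs);
    };
}

// Hypothetical usage: query only after the user pauses for half a second.
// $("#search").keyup(debounce(function () { /* issue the AJAX call */ }, 500));
```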
Another improvement is to go with a text search engine. MySQL has fulltext searching with the MyISAM and InnoDB (new in 5.6) engines, but MyISAM comes with some trade-offs and InnoDB fulltext might not give production-ready results (as shown here). Fortunately, it's fairly easy to switch the table type in MySQL, and having a small table holding just a copy of the names should mitigate any issues. With some databases, more complex DBA operations like this would force you to rebuild everything yourself.
If your site becomes busy enough, a dedicated search engine would probably be a good idea, something like Lucene/CLucene (a c++ port), Solr (Lucene subproject), Sphinx or Xapian to name a few.
About the only real alternative I see is to push the whole data set to each visitor and have the client handle the queries and results entirely by itself in JavaScript. With HTML5 applications this might have its uses, but make sure the data gets cached.
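As a rough sketch of that client-side approach, assuming the full list of names has already been shipped to the browser (`localMatches` and the sample data are illustrative):

```javascript
// Filter a preloaded list of titles entirely in the browser:
// case-insensitive substring match, capped at `limit` results,
// so no request ever leaves the page while the user types.
function localMatches(titles, term, limit) {
    var needle = term.toLowerCase();
    var out = [];
    for (var i = 0; i < titles.length && out.length < limit; i++) {
        if (titles[i].toLowerCase().indexOf(needle) !== -1) {
            out.push(titles[i]);
        }
    }
    return out;
}
```

jQuery UI's autocomplete does much the same thing when you pass it a plain array as `source`, filtering locally with no server round trips.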
Upvotes: 3
Reputation: 8156
That depends on what you call smart.
If you want to reduce your server load, there are a few things you can do:
Set a longer delay.
This is one of the easiest things you can do. The number of queries can be reduced by the delay parameter:
$( ".selector" ).autocomplete({
delay: 500
});
The default is 300 milliseconds; you might want to set it higher.
Set a minLength
The other easy thing to do is to set a minimum length that must be typed before querying starts. Set it to a number larger than one, and adjust it to your taste.
$( ".selector" ).autocomplete({
delay: 500, //from before
minLength: 3
});
Optimise your queries.
You said this is the most resource-consuming part of your application, so make sure your queries are as optimal as possible. Depending on the database, there are best practices to search for. Be sure to query only the minimal amount of required data: if you want to show at most 5 autocomplete results, query for only 5 records (in MySQL there is LIMIT, though I have heard even that can be optimised further).
Indexes in the right places can drastically increase the speed of your queries; for example, look at this article.
A cache table.
If your query joins a lot of tables, consider making a really simple table where you only store the possible autocomplete values. Your queries will then be much simpler, although you will need to populate the table beforehand.
Upvotes: 2