Thiago Belem

Reputation: 7832

Complex search workaround

First of all, this is not necessarily a question... but I'd really like your opinion on the performance and possible problems of this approach to search.

I need to create a really complex search across multiple tables, with lots of filters, ranges and rules... and I realized that I could do something like this:

  1. Submit the search form
  2. Internally, run every filter and piece of logic step by step (this may take a few seconds)
  3. After finding all the matching records (the result that I want), create a record in my searches table, generating a token for this search (based on the search params), like 86f7e437faa5, and save all the matching record IDs
  4. Redirect the visitor to a page like mysite.com/search?token=86f7e437faa5

And on the results page, I only need to look up which search the token refers to and paginate the result IDs (retrieved from the searches table).

This makes refreshing and pagination much faster, since I don't need to run all the search logic on every pageview. And if the user changes a filter or search criterion, I go back to step 2 and generate a new search token.
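The flow described above can be sketched roughly like this (a minimal in-memory sketch; the dict stands in for the searches table, and `run_search`, `create_search_token`, and `get_page` are hypothetical names, not from any real system):

```python
import hashlib
import json

# Hypothetical in-memory stand-in for the "searches" table.
SEARCHES = {}

def run_search(params):
    """Stand-in for the expensive multi-table search; returns matching IDs."""
    # In the real system this would run all the filters, ranges and rules.
    return [i for i in range(1, 100) if i % params.get("step", 1) == 0]

def create_search_token(params):
    """Run the full search once and cache the result IDs under a token."""
    canonical = json.dumps(params, sort_keys=True)
    token = hashlib.md5(canonical.encode()).hexdigest()[:12]
    if token not in SEARCHES:
        SEARCHES[token] = run_search(params)
    return token

def get_page(token, page, per_page=10):
    """Paginate the cached IDs without re-running the search logic."""
    ids = SEARCHES[token]
    start = (page - 1) * per_page
    return ids[start:start + per_page]
```

Here, repeating the same search with the same params yields the same token, so the expensive step runs only once; only the cheap slicing runs per pageview.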

I've never seen a tutorial or anything about this, but I think that's what some forums like BBForum or Invision do with search, right? After searching, I'm redirected to something like search.php?id=1231 (I don't see the search params in the URL or in the POST args).

This "token" won't last longer than 30 min to 1 h, so the "static search" is just for performance reasons.

What do you think about this? Will it work? Any considerations? :)

Upvotes: 1

Views: 262

Answers (4)

Hugo Mota

Reputation: 11577

Sphinx seems to be a nice solution if you have control of your server (in a VPS for example).

If you don't, and a simple full-text search isn't enough for you, I guess this is a nice solution. But it doesn't seem all that different to me from a paginated search with caching, though it does seem better than a paginated search with simple URL-keyed caching. You still have the problem of the searches remaining static, so I recommend you flush the saved searches from time to time.

Upvotes: 0

Lex

Reputation: 1378

Your system can use a token like 86f7e437faa5 to cache search requests. It's a very useful mechanism for efficiency and scalability.

But the user must still see all the parameters, in accordance with usability principles.

So generating a hash of the parameters on the fly, server-side, would be a good solution. The system checks for the existence of the generated hash in the searches table and returns the result if found. If the hash isn't there, the system runs the query against the base tables and saves the new result into the searches table.
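That check-then-compute step could look something like this (a hedged sketch; `cached_search`, `params_hash`, and the 30-minute TTL are illustrative assumptions tied to the question's "30min~1h" lifetime, not part of any specific system):

```python
import hashlib
import json
import time

CACHE = {}          # hypothetical stand-in for the searches table
TTL_SECONDS = 1800  # flush entries after ~30 minutes, per the question

def params_hash(params):
    """Deterministic hash of the search parameters."""
    canonical = json.dumps(params, sort_keys=True)
    return hashlib.sha1(canonical.encode()).hexdigest()[:12]

def cached_search(params, run_query):
    """Return cached result IDs if the hash exists and is fresh; else recompute."""
    key = params_hash(params)
    entry = CACHE.get(key)
    if entry and time.time() - entry["created"] < TTL_SECONDS:
        return entry["ids"]
    ids = run_query(params)  # the expensive query against the base tables
    CACHE[key] = {"ids": ids, "created": time.time()}
    return ids
```

The nice property is that two users submitting identical parameters share one cached result, because the hash is derived only from the params.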

Upvotes: 2

goncin

Reputation: 1

TiuTalk,

Are you considering keeping searches saved in your "searches" table? If so, remember that your param-based token will remain the same for a given set of parameters, persisting over time. If your search base changes frequently, you can't rely on saved searches, as they may return outdated results. Otherwise, it seems like a good solution overall.

I'd rather base the token on the user session. What do you think?
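Basing the token on the session could be as simple as mixing the session ID into the hash (a tiny illustrative sketch; `session_token` is a made-up name):

```python
import hashlib
import json

def session_token(session_id, params):
    """Token that differs per user session, so cached results are never shared."""
    canonical = json.dumps(params, sort_keys=True)
    return hashlib.sha1((session_id + canonical).encode()).hexdigest()[:12]
```

The trade-off is that identical searches from different users no longer share a cache entry, but stale results are scoped to a single session.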

@g0nc1n

Upvotes: 0

Riedsio

Reputation: 9926

Seems logical enough to me.

Having said that, given the description of your application, have you considered using Sphinx? Regardless of the number of tables and/or filters and/or rules, all that time-consuming work is in the indexing, and it's done beforehand/behind the scenes. The filtering/rules/fields/tables are all handled quickly and on the fly after the fact.

So, similar to your situation, Sphinx could give you your set of IDs very quickly, since all the hard work was done up front.

Upvotes: 1
