Reputation: 10236
I am working with a table that has 1,500 records, and I query the comments column using LIKE '%word%'. That works well for now, but comments are increasing day by day, so I am thinking of switching to full-text search or a tool like http://sphinxsearch.com.
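For reference, the query is essentially shaped like this (the table and column names below are just placeholders for my real schema):

    -- placeholder table/column names standing in for my actual schema
    SELECT id, comment_text
    FROM comments
    WHERE comment_text LIKE '%word%';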
Roughly how many records can a table hold before full-text search or a search engine (Sphinx) becomes the more effective choice? I suspect field size also matters: I am only dealing with short comments, so it is fine for now, but with long articles I would probably feel the power of full-text search or Sphinx much sooner.
Thanks.
Upvotes: 1
Views: 210
Reputation: 441
LIKE '%word%' cannot use an ordinary index because of the leading wildcard, so MySQL has to scan and pattern-match every row. That can become very slow very fast as the table grows. Switch to Sphinx, it's very good.
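Once the comments table has been indexed in Sphinx, searches go through searchd instead of MySQL's LIKE. A minimal SphinxQL sketch, assuming an index named comments_idx has been defined in sphinx.conf over the comments table:

    -- connect to searchd's SphinxQL port (9306 by default) with any MySQL client;
    -- 'comments_idx' is an assumed index name from sphinx.conf
    SELECT id FROM comments_idx WHERE MATCH('word') LIMIT 20;

MATCH() runs against Sphinx's inverted index, so it does not have to pattern-match every row the way LIKE '%word%' does.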
Upvotes: 0
Reputation: 26699
The moment the dataset (in your case, the whole table) can no longer fit in memory, indexing (hence a full-text index, Sphinx, Lucene, etc.) makes a big difference.
It also makes a big difference under a highly concurrent, mixed read/write load, because with your query the whole table has to be scanned, and that involves locking, scanning of redo logs, etc. (depending on the transaction isolation level).
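If you would rather stay inside MySQL than add Sphinx or Lucene, a built-in full-text index gives the same kind of benefit. A minimal sketch, assuming placeholder table/column names and a storage engine that supports FULLTEXT indexes (MyISAM, or InnoDB from MySQL 5.6 onward):

    -- build the inverted index once; 'comments' / 'comment_text' are assumed names
    ALTER TABLE comments ADD FULLTEXT INDEX ft_comment_text (comment_text);

    -- queries then use MATCH ... AGAINST instead of LIKE '%word%',
    -- so the server consults the index rather than scanning every row
    SELECT id, comment_text
    FROM comments
    WHERE MATCH(comment_text) AGAINST ('word');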
Upvotes: 1