Reputation: 397
Currently I have a large set of data stored in a database.
This set of data (3000 records) is retrieved by users frequently.
The method that I'm using right now is:
Search results from the cached DataTable object based on the query:
CachePostData.Select(string.Format("Name LIKE '%{0}%'", txtItemName.Text));
Bind the result to a repeater (with paging that displays 40 records per page).
But I notice that the performance is not good (about 4 seconds for every request). So I am wondering, is there any better way to do this? Or should I just retrieve the results straight from the database for every query?
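For reference, the binding step looks roughly like this (rptItems and currentPage are simplified placeholders, not my exact code):
// Requires System.Data and System.Web.UI.WebControls.
DataRow[] rows = CachePostData.Select(
    string.Format("Name LIKE '%{0}%'", txtItemName.Text));

// PagedDataSource handles the 40-records-per-page slicing before binding.
var pagedData = new PagedDataSource
{
    DataSource = rows,              // DataRow[] implements IEnumerable
    AllowPaging = true,
    PageSize = 40,
    CurrentPageIndex = currentPage  // placeholder for the current page index
};

rptItems.DataSource = pagedData;    // rptItems is a placeholder Repeater
rptItems.DataBind();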
Upvotes: 2
Views: 2327
Reputation: 124746
DataTable.Select
is probably not the most efficient way to search an in-memory cache, but it certainly shouldn't take 4 seconds or anything like that to search 3000 rows.
First step is to find out where your performance bottlenecks are. I'm betting it's nothing to do with searching the cache, but you can easily find out, e.g. with code something like:
var stopwatch = Stopwatch.StartNew();
var result = CachePostData.Select(string.Format("Name LIKE '%{0}%'", txtItemName.Text));
WriteToLog("Read from cache took {0} ms", stopwatch.Elapsed.TotalMilliseconds);
where WriteToLog
traces somewhere (e.g. System.Diagnostics.Trace, System.Diagnostics.Debug, or a logging framework such as log4net).
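For example, a minimal WriteToLog sketch that traces via System.Diagnostics.Trace might look like this (the method name and format-string signature are just assumptions to match the snippet above):
using System.Diagnostics;

static void WriteToLog(string format, params object[] args)
{
    // Forwards the message to whatever trace listeners are configured
    // (output window, a TextWriterTraceListener in web.config, etc.).
    Trace.TraceInformation(format, args);
}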
If you are looking for an alternative way of caching, you could simply cache a generic list of entity objects and use LINQ to search the list:
var result = CachePostData.Where(x => x.Name.Contains(txtItemName.Text));
This is probably slightly more efficient (for example, it doesn't need to parse the "Name LIKE ..." filter expression), but again, I don't expect this is your bottleneck.
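A rough sketch of that approach might look like this (Post, the "PostList" cache key and LoadPostsFromDatabase are assumed names, not from the question):
using System.Collections.Generic;
using System.Linq;
using System.Web;

public class Post
{
    public int Id { get; set; }
    public string Name { get; set; }
}

public static class PostCache
{
    public static List<Post> GetPosts()
    {
        var posts = HttpContext.Current.Cache["PostList"] as List<Post>;
        if (posts == null)
        {
            posts = LoadPostsFromDatabase();           // assumed data-access call
            HttpContext.Current.Cache["PostList"] = posts;
        }
        return posts;
    }

    public static List<Post> Search(string term)
    {
        // Equivalent of the Name LIKE '%term%' filter, done with LINQ to Objects.
        return GetPosts().Where(x => x.Name.Contains(term)).ToList();
    }

    private static List<Post> LoadPostsFromDatabase()
    {
        // Placeholder: replace with your actual data access code.
        return new List<Post>();
    }
}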
Upvotes: 3
Reputation: 10859
It is a bit hard to recommend the correct solution for your problem without knowing how many hits on the database you actually expect. In most cases, I would not cache the data in the ASP.NET cache for filtering, because searching by using DataTable.Select
basically performs a table scan and cannot take advantage of database indexing. Unless you run into really heavy load, most database servers should be capable of performing this task with less delay than filtering the DataTable
in .NET.
If your database supports full-text search (e.g. MSSQL or MySQL), you could create a full-text index on the name column and query that. Full-text search should give you even faster response times for these types of LIKE
queries.
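As a rough sketch, querying such an index directly with a parameterized command might look like this (the Posts table, Name column and connectionString are assumptions, and CONTAINS is MSSQL full-text syntax):
using System.Data.SqlClient;

// Sketch only: table/column names and connectionString are assumed, and the
// Name column must already have a full-text index for CONTAINS to work.
using (var connection = new SqlConnection(connectionString))
using (var command = new SqlCommand(
    "SELECT Id, Name FROM Posts WHERE CONTAINS(Name, @term)", connection))
{
    // "word*" is the full-text prefix-match form of the search term.
    command.Parameters.AddWithValue("@term", "\"" + txtItemName.Text + "*\"");
    connection.Open();
    using (var reader = command.ExecuteReader())
    {
        while (reader.Read())
        {
            // read reader["Id"], reader["Name"] and build your result list here
        }
    }
}
Using a parameter here also keeps the user input out of the SQL text itself, rather than building the filter by string concatenation.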
Generally, caching data for faster access is good, but in this case, the DataTable
is most likely inferior to the database server in terms of searching for data. You could still use the cache to display unfiltered data faster and without hitting the database.
Upvotes: 2
Reputation: 172488
I think using a DataTable would be more efficient, as by doing that you will reduce the hits on your database server. You can store the DataTable in the cache and then reuse it. Something like this:
public DataTable myDatatable()
{
    DataTable dt = HttpContext.Current.Cache["key"] as DataTable;
    if (dt == null)
    {
        // Cache miss: load from the database (LoadFromDatabase is a placeholder
        // for your own data access code), then store the result in the cache.
        dt = LoadFromDatabase();
        HttpContext.Current.Cache["key"] = dt;
    }
    return dt;
}
Also check SqlCacheDependency
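For example, a polling-based dependency could be wired up roughly like this ("MyDb" must match a sqlCacheDependency database entry in web.config, and "Posts" is an assumed table name enabled for notifications):
// Sketch only: "MyDb" and "Posts" are placeholders; polling-based
// SqlCacheDependency also needs the table enabled for change notifications.
var dependency = new System.Web.Caching.SqlCacheDependency("MyDb", "Posts");
HttpContext.Current.Cache.Insert("key", dt, dependency);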
You may also expire the cache after a particular time interval, like this:
HttpContext.Current.Cache.Insert("key", dt, null, DateTime.Now.AddHours(2),
System.Web.Caching.Cache.NoSlidingExpiration);
Also check DataTable caching performance
Upvotes: 2