Reputation: 11247
I am developing a web app in ASP.NET / C# with MySQL and Entity Framework. I want to use some counter columns that will frequently be updated by incrementing their value by 1. The counter columns live in a product table. I have two questions:
1) Should I use a second table for the counters? The main product table will be read frequently, and if I update the row very often that could hurt performance, because the row will be locked each time the counters are updated (I use InnoDB).
2) What is the best way to make a +1 increment on a row using MySQL and Entity Framework? I don't mind if the counters are not 100% accurate, so if there are update conflicts I can catch them and skip that update. I assume most updates will go through quickly and be counted.
I am adding some code that I am using to implement this; it uses a static dictionary so I don't have to write to the DB on every request.
Code:
if (Global.DataCounters == null)
{
    // first use: create the in-memory counter store
    Global.DataCounters = new Dictionary<int, int>();
}

if (Global.DataCounters.ContainsKey(id))
{
    // increment the in-memory counter for this id
    Global.DataCounters[id]++;

    // flush to the database at most once every 10 minutes
    if ((DateTime.Now - Global.LastFlushed).TotalMinutes > 10)
    {
        // update counters in db and reset data
        Global.LastFlushed = DateTime.Now;
    }
}
else
{
    // first time view of this id: start counting at 1
    Global.DataCounters.Add(id, 1);
}
Global.asax:
public static Dictionary<int, int> DataCounters = new Dictionary<int, int>();
public static DateTime LastFlushed;

void Application_Start(object sender, EventArgs e)
{
    LastFlushed = DateTime.Now;
    RegisterRoutes(System.Web.Routing.RouteTable.Routes);
}
Do you think it's a good implementation?
Thanks
Upvotes: 1
Views: 306
Reputation: 49245
A second table for the counters would only help if a significant share of product table reads (say > 50%) does not need the counter information. Otherwise you would always be joining the two tables, making your reads more expensive.
Based on the MySQL docs: when updating counters, if your WHERE condition uses a unique index to locate a single row to update, only a row-level lock is taken (no gap lock). That is the best possible scenario for your case (assuming the product id can be used as a unique index). I would suggest firing the UPDATE statement directly as a store command (or calling an SP that does the same).
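For example, something along these lines (a minimal sketch; MyEntities, products, view_count and product_id are assumed names from your model, and the generated context must derive from ObjectContext for ExecuteStoreCommand to be available):
// Single UPDATE statement; the WHERE clause hits the unique key,
// so InnoDB only takes a row-level lock on that one row.
using (var context = new MyEntities())
{
    context.ExecuteStoreCommand(
        "UPDATE products SET view_count = view_count + 1 WHERE product_id = {0}",
        productId);
}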
If you want to work purely in entity terms (no raw SQL or SPs), then don't use a transaction scope for this particular operation. That way the SELECT that fetches the product entity takes no lock, and SaveChanges issues a single UPDATE statement incrementing the counter. Of course, using two statements outside a transaction means the update can fail (assuming you have some timestamp column that can be used for an optimistic concurrency check), but you have indicated that this is acceptable.
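Roughly like this (again a sketch; Products, ProductId and ViewCount are assumed names, and OptimisticConcurrencyException is only thrown if the entity has a concurrency token such as a timestamp column configured):
// requires: using System.Data; using System.Linq;
using (var context = new MyEntities())
{
    try
    {
        // plain read-modify-write, no explicit transaction
        var product = context.Products.First(p => p.ProductId == productId);
        product.ViewCount++;
        context.SaveChanges();   // one UPDATE statement
    }
    catch (OptimisticConcurrencyException)
    {
        // a concurrent update won the race; losing one increment is acceptable
    }
}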
Finally, if a transaction is unavoidable while incrementing the counters, try to keep it as short as possible and use a lower isolation level such as READ COMMITTED.
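For instance (a sketch using System.Transactions; the timeout value and the names reused from the earlier snippets are assumptions):
// requires: using System.Transactions;
var options = new TransactionOptions
{
    IsolationLevel = IsolationLevel.ReadCommitted,
    Timeout = TimeSpan.FromSeconds(5)
};

using (var scope = new TransactionScope(TransactionScopeOption.Required, options))
using (var context = new MyEntities())
{
    // keep the work inside the transaction minimal
    context.ExecuteStoreCommand(
        "UPDATE products SET view_count = view_count + 1 WHERE product_id = {0}",
        productId);
    scope.Complete();
}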
Upvotes: 1