annemartijn

Reputation: 1644

How can I prevent two applications clashing because they share one database?

Situation

There are two applications:

  • A Windows Service
  • An ASP.NET web application

Both applications use one (mirrored) MSSQL database (SQL Server 2014). We query data using the NHibernate ORM. The applications and the database server each run on their own dedicated server.

Problem

There are millions of records in the database, so queries only perform well if sufficient indexes are available. The Windows Service mostly uses clustered indexes for its queries, but there are many non-clustered indexes for querying other fields in the web application. The Windows Service should be as fast as possible.

Our solutions thus far

Question

Are there other solutions, besides our current efforts? We don't want to add extra servers, we want to solve this issue using new architecture or software. In an ideal world, use of the web application would not affect performance of the Windows Service at all.

Upvotes: 2

Views: 222

Answers (3)

Greg

Reputation: 4055

You didn't say what version of SQL Server you are on, nor did you describe your HDD setup (shared storage, local storage, etc.). You did say the HDD was the bottleneck (not memory, correct?), but not whether that was due to writes alone or writes plus reads. However, since you have a requirement of being only a couple of minutes behind, and you do not want to introduce a reporting server: if you are on SQL Server 2012 or later, you could use AlwaysOn with an asynchronous replica. Just ensure the secondary replica is on a different set of disks than the primary database. By separating the secondary database, the reads will not conflict with the writes, while still being only seconds behind the primary.
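A minimal sketch of the idea, using hypothetical names (availability group `MyAg`, secondary replica `WEB-SQL2`, listener `AgListener`): allow read-intent connections on the async secondary, then have the web application connect with `ApplicationIntent=ReadOnly` so its reads land on the secondary while the Windows Service keeps writing to the primary.

```sql
-- Hypothetical availability group "MyAg" and secondary replica "WEB-SQL2".
-- Allow read-intent connections on the asynchronous secondary:
ALTER AVAILABILITY GROUP [MyAg]
MODIFY REPLICA ON N'WEB-SQL2'
WITH (SECONDARY_ROLE (ALLOW_CONNECTIONS = READ_ONLY));

-- The web application then connects with read intent, e.g.:
--   Server=AgListener;Database=AppDb;ApplicationIntent=ReadOnly;
-- (routing read-intent connections to the secondary additionally requires
--  READ_ONLY_ROUTING_URL / READ_ONLY_ROUTING_LIST to be configured)
```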

Upvotes: 2

Dan Field

Reputation: 21661

In addition to Blam's suggestions, you may want to look into caching reads for your ASP.NET application, probably using ASP.NET's Cache class: https://msdn.microsoft.com/en-us/library/system.web.caching.cache%28v=vs.110%29.aspx - see https://msdn.microsoft.com/en-us/library/vstudio/ff477235%28v=vs.100%29.aspx for an implementation sample as well.

If you can tolerate dirty reads (which may or may not be the case - would it be OK for the application user to see partially updated data?), you could combine this with SELECT ... WITH (NOLOCK).

The basic idea would be that your database only gets hit by the web application every 5 minutes or so (but for a larger read) instead of in real time. The question then becomes how much data you have to cache every 5 minutes. If you can cache a significant portion of the data your users need in a reasonably sized cache, great. There are a few strategies you could take here:

  • Cache only data that doesn't change that often, and update that cache less frequently
  • Cache only data related to an object/entity that has been recently updated
  • Cache only data that is frequently requested by the site

All of these would require some analysis of when and how your ASP application is hitting the DB.

You could combine this with a polling pattern for further improvement: when the Service updates an entity in the database, it also creates a record in another table that your ASP.NET application polls to keep the cache up to date. The polling table would contain a master key (or set of keys) to access the entity, so your ASP cache updater could request only the entities that are out of date.
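A sketch of that polling table in T-SQL, using hypothetical names (`EntityChange` for the log table, `EntityId` for the master key):

```sql
-- Hypothetical change-log table the Windows Service writes to
CREATE TABLE dbo.EntityChange (
    ChangeId  bigint IDENTITY(1,1) PRIMARY KEY,
    EntityId  bigint NOT NULL,
    ChangedAt datetime2 NOT NULL DEFAULT SYSUTCDATETIME()
);

-- Service side: record the key alongside the real update (same transaction)
INSERT INTO dbo.EntityChange (EntityId) VALUES (@EntityId);

-- Web side: the cache updater fetches only keys changed since its last poll
SELECT DISTINCT EntityId
FROM dbo.EntityChange
WHERE ChangeId > @LastSeenChangeId;
```

The monotonic `ChangeId` lets the web application remember a single watermark between polls instead of timestamps, which avoids clock-skew issues.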

Upvotes: 1

paparazzo

Reputation: 45096

Not an ideal world. Any activity on a table is going to affect other activity.

As you know, indexes slow down inserts, updates, and deletes.

Index maintenance (defragmentation) will help.

A fill factor of less than 100 (e.g. 50) will help.
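For example (hypothetical table and index names), a fill factor of 50 leaves half of each leaf page empty, so subsequent inserts cause fewer page splits:

```sql
-- Hypothetical index: FILLFACTOR = 50 leaves 50% of each leaf page free
CREATE NONCLUSTERED INDEX IX_Orders_CustomerId
ON dbo.Orders (CustomerId)
WITH (FILLFACTOR = 50);

-- Rebuilding an existing index defragments it and reapplies the fill factor
ALTER INDEX IX_Orders_CustomerId ON dbo.Orders
REBUILD WITH (FILLFACTOR = 50);
```

The trade-off is larger indexes and slightly slower scans, so this only pays off on indexes with heavy insert activity.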

Try to perform inserts in the order of the PK.
If not the PK, then in the order of as many indexes as possible.

Do set-based updates, inserts, and deletes when possible.
It is far more efficient to take one write lock for 1,000 inserts than to insert one row at a time.
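As a sketch with hypothetical staging and target tables, the set-based form is one statement instead of a client-side loop:

```sql
-- Row-by-row (lock acquisition and log record per statement):
--   INSERT INTO dbo.Orders (CustomerId, Amount) VALUES (@c, @a);  -- x1000

-- Set-based: one statement, far fewer lock acquisitions
INSERT INTO dbo.Orders (CustomerId, Amount)
SELECT CustomerId, Amount
FROM dbo.OrderStaging
ORDER BY OrderId;  -- hypothetical: matches the PK order of dbo.Orders
```

With NHibernate, batching (`adonet.batch_size`) gets part of the way there, but a true set-based statement on the server is still cheaper.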

The primary impact of the web application is that it takes locks.
Do you have the option of SELECT ... WITH (NOLOCK)?
WITH (NOLOCK) is a dirty read, so use it with caution.
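For example (hypothetical table), the hint goes on each table reference; because it reads uncommitted data, results can include rows that are later rolled back:

```sql
-- Dirty read: takes no shared locks, may see uncommitted rows
SELECT OrderId, Amount
FROM dbo.Orders WITH (NOLOCK)
WHERE CustomerId = @CustomerId;

-- Equivalent at session scope:
--   SET TRANSACTION ISOLATION LEVEL READ UNCOMMITTED;
```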

Upvotes: 2
