Reputation: 858
I have a transaction which inserts a huge amount of data (over 5000 records) into 3 tables (so it takes a while), which is OK, but I noticed something disturbing: all 3 tables seem to be locked during that process.
Insert transaction:
using (var db = new MyEntities())
{
    using (var tr = db.Database.BeginTransaction())
    {
        // blah, blah, a lot of entities to create, **some custom SQL, bulk inserts** etc.
        db.SaveChanges();
        tr.Commit();
    }
}
During the save I'm unable to read from the tables, whether through EF or a raw SQL query. But selecting WITH (NOLOCK) works:
SELECT * FROM Table1 WITH (NOLOCK)
First of all, I'd like to know what is actually happening. Is it just the "natural" behaviour, with the tables locked no matter what, or is it lock escalation or something else?
Second, is it possible to disable locks at the level of the insert transaction? Or do I have to add WITH (NOLOCK) to all SELECT queries?
Third, is it possible to make SELECT queries return data during the save, but ONLY records from committed transactions (for example: my insert transaction has already added 2000 records with 3000 pending; I run a SELECT and it returns all records except those 2000, which are not yet committed)?
Upvotes: 0
Views: 5190
Reputation: 294427
The tables are not locked. The specific records being inserted are locked. You are noticing blocking because you attempt to read these newly inserted rows, before they are committed.
Enable read committed snapshot in your database.
ALTER DATABASE <yourdb> SET READ_COMMITTED_SNAPSHOT ON;
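As a quick sanity check (a minimal sketch; 'yourdb' is a placeholder for your database name), you can confirm the option actually took effect:

SELECT name, is_read_committed_snapshot_on
FROM sys.databases
WHERE name = 'yourdb';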
There are more details for those who want to dig deeper, such as possible lock escalation, missing indexes causing scans, and the cost of row-level isolation.
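If lock escalation is in play (SQL Server typically tries to escalate to a table lock once a single statement holds roughly 5,000 locks, which matches the record counts here), one option is to turn escalation off for the affected table. A sketch, assuming the Table1 from the question:

-- assumption: Table1 (from the question) lives in the dbo schema
ALTER TABLE dbo.Table1 SET (LOCK_ESCALATION = DISABLE);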
Upvotes: 1
Reputation: 2254
Use the NOLOCK hint only when necessary; it is often called a bad habit to kick. Have a look at TRANSACTION ISOLATION LEVEL, which may be useful for dealing with such issues in a more controlled manner; see the sketch below.
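A minimal sketch of that approach (MyDb is a placeholder; SNAPSHOT requires the ALLOW_SNAPSHOT_ISOLATION database option to be enabled first):

-- one-time database setting (assumes you are allowed to alter the database)
ALTER DATABASE MyDb SET ALLOW_SNAPSHOT_ISOLATION ON;

-- in the reading session: sees only committed data, without blocking on the insert
SET TRANSACTION ISOLATION LEVEL SNAPSHOT;
SELECT * FROM Table1;

Upvotes: 0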