Reputation: 5875
I've just started doing some real-world performance testing on my Fluent NHibernate / SQLite project, and am experiencing some serious delays when I Commit to the database. By serious, I mean taking 20 - 30 seconds to commit 30 K of data!
This delay seems to get worse as the database grows. When the SQLite DB file is empty, commits happen almost instantly, but when it grows to 10 Meg, I see these huge delays.
The database has 16 tables, averaging 10 columns each.
One possible problem is that I'm storing a dozen or so IList&lt;float&gt; members, though each is typically only 200 elements long. Support for these is a recent addition to Fluent NHibernate automapping, which stores each float in a separate table row, so maybe that's the problem.
Any suggestions on how to track this down? I suspect SQLite is the culprit, but maybe it's NHibernate?
I don't have any experience with profilers, but am thinking of getting one. I'm aware of NHibernate Profiler - any recommendations for profilers that work well with SQLite?
Edit
Some more testing indicates that it's not the size of the database that causes the slowdown - it's dependent on how many Saves I've done since I started my program. Commit time for the first Save is about 300 ms, and goes to over 1000 ms by the 50th Save.
I'm keeping a single Session open all the time - maybe I need some explicit Flush logic?
Also, I downloaded the NHibernate Profiler. It gave me an alert about "Large number of individual writes" for my IList members. The alert description suggested turning on batching, but SQLite does not support that, as far as I can tell.
The alert also mentioned MultiQuery support, so I'm going to read up on that.
/Edit
Here's the method that saves the data - ignoring the error handling and debug logging, it's just a Save call and a Commit.
public static void SaveMeasurement(object measurement)
{
    // Get the application's database session
    var session = GetSession();

    using (var transaction = session.BeginTransaction())
    {
        session.Save(measurement);
        transaction.Commit();
    }
}
Upvotes: 3
Views: 4916
Reputation: 5875
Edit After reading zoidbeck's answer, I've abandoned this approach. See my comment to his answer. /Edit
After some Googling, I came across some posts that implied that the NHibernate cache could actually slow things down considerably under certain circumstances. Apparently, NH can spend more time looking things up in the cache than doing the actual Commit.
I tried doing a session.Clear after every Commit, but that broke lazy loading.
The solution (for now) was to call:
session.Evict(measurement);
after each Commit, which evicts the measurement from the session cache. Now my commits take roughly the same time (about 800 ms, for 30 K of data), no matter how big the DB is.
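For reference, here's a sketch of the save method with the Evict call added (everything except the Evict line is from the code in the question):

```csharp
public static void SaveMeasurement(object measurement)
{
    var session = GetSession();
    using (var transaction = session.BeginTransaction())
    {
        session.Save(measurement);
        transaction.Commit();
    }
    // Remove the saved object from the session's first-level cache
    // so it isn't dirty-checked again on every later flush.
    session.Evict(measurement);
}
```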
The root problem seems to be my IList<float>
members, which generate hundreds of SQL Inserts. Batching would probably solve this, but, alas, is only supported by SQL Server, not SQLite.
At some point, I'll have to optimize this - perhaps by storing them as a BLOB. But that's a question for another day.
Upvotes: -1
Reputation: 4151
I think using session.Evict() only covers the symptom. You haven't posted the GetSession() method, but the comment above and the remark that session.Clear() breaks your lazy loading make me guess you are using one session for the whole application.
This is extremely inefficient and will slow down your app the longer it runs. Creating a new session, on the other hand, is really cheap and will give you a clean, fast session that only tracks the objects you want it to.
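A minimal session-per-operation sketch (assuming a sessionFactory field built once at startup; the method name is taken from the question):

```csharp
public static void SaveMeasurement(object measurement)
{
    // Open a fresh session per unit of work instead of reusing
    // one application-wide session that accumulates tracked objects.
    using (var session = sessionFactory.OpenSession())
    using (var transaction = session.BeginTransaction())
    {
        session.Save(measurement);
        transaction.Commit();
    }
}
```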
IMO you should think about declarative transaction management. Personally I prefer Spring's TX management, but there are also other clever solutions that aren't as method-based, for example Castle's ActiveRecord.
Upvotes: 3
Reputation: 14461
NHibernate is never going to be fast.
Some older versions of NHibernate aggressively close the connection to the database. This causes it to open/close the SQLite database a LOT (which is very slow). I believe newer versions include an option to keep the connection open longer. If you're using an in-memory database, everything is lost as soon as the connection closes.
Problem using SQLite :memory: with NHibernate
Upvotes: 1
Reputation: 10180
You say that you have a list of measurement objects, but you are saving them one at a time. That would be fine if all of the saves were in one transaction, but you have each save wrapped in its own transaction. This will hurt your performance no matter how large the DB is.
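A sketch of the one-transaction shape, reusing the GetSession() helper from the question (SaveMeasurements is a hypothetical name):

```csharp
public static void SaveMeasurements(IEnumerable<object> measurements)
{
    var session = GetSession();
    // One transaction around all the saves: a single Commit, so
    // SQLite syncs its journal once instead of once per object.
    using (var transaction = session.BeginTransaction())
    {
        foreach (var measurement in measurements)
            session.Save(measurement);
        transaction.Commit();
    }
}
```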
Upvotes: 2
Reputation: 49261
Are you using autoincrement primary keys? This will cause NHibernate to perform a select after each insert to populate the primary key property on the object. I'm not sure how it does it for SQLite but your statement that the problem gets worse as the database grows points to this as a probable root cause.
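If that turns out to be the cause, a client-assigned id generator avoids the round trip, since NHibernate can set the key before the INSERT. A Fluent NHibernate mapping sketch (Measurement, its Guid Id, and the Timestamp property are assumed names, not from the question):

```csharp
public class MeasurementMap : ClassMap<Measurement>
{
    public MeasurementMap()
    {
        // guid.comb ids are generated client-side, so no SELECT
        // is needed after each INSERT to read back the key.
        Id(x => x.Id).GeneratedBy.GuidComb();
        Map(x => x.Timestamp);
    }
}
```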
Your plan to use a profiler is the best course of action. The NHibernate Profiler is excellent and has a trial period that will allow you to use it to troubleshoot this problem.
Upvotes: 1
Reputation: 16864
Your transaction should be around the whole operation, not each save.
You would also then benefit from enabling ADO.NET batching (e.g. an adonet.batch_size of 1000).
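For what it's worth, a sketch of setting the batch size via Fluent NHibernate's configuration API (connString and MeasurementMap are assumed names; note the question targets SQLite, whose NHibernate driver doesn't support batching, so this mainly helps if you move to SQL Server):

```csharp
var sessionFactory = Fluently.Configure()
    .Database(MsSqlConfiguration.MsSql2008
        .ConnectionString(connString)
        .AdoNetBatchSize(1000))  // send up to 1000 statements per round trip
    .Mappings(m => m.FluentMappings.AddFromAssemblyOf<MeasurementMap>())
    .BuildSessionFactory();
```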
Upvotes: 4