Reputation: 478
Could you please share your experience with NHibernate schema generation? How scalable is it in terms of the complexity and size of the data model? Does it have any major performance implications compared to a hand-crafted data model?
Upvotes: 1
Views: 279
Reputation: 61
If you need to export your schema and populate your database, you may want to look at the Fluent NHibernate Schema Tool. It can read your assemblies, hibernate.cfg.xml, *.hbm.xml files, and Fluent mappings. You can generate/execute the DDL for your database (create/update/drop tables), and it accepts a CSV-like input file for populating the created/updated database (the dataset file also accepts small HQL queries). This tool is very useful for unit testing and for web applications that use NHibernate.
See more: https://bitbucket.org/guibv/fnst/wiki/Home.
Upvotes: 1
Reputation: 64628
I would say that there aren't any performance implications. In fact, there are many options for controlling how the tables are created to fit the mapping files. There are some additional features just for schema creation, such as the ability to specify database data types, create constraints and indexes, and run arbitrary SQL when creating the schema.
Performance tuning can usually be done after automatically creating the schema. For instance, you let NH create the tables and then run some ALTER TABLE statements to set performance-relevant settings. It is also very easy to create (or replace) indexes afterwards. All of this could even be written into the mapping files. The hard work is still done by NH: creating all the tables and columns according to the information that is already there in the mapping files.
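The "written into the mapping files" idea above can be sketched with NHibernate's `<database-object>` mapping element, which lets the schema tool run arbitrary DDL alongside table creation. A minimal sketch, assuming a hypothetical `Customer` class and index name:

```xml
<!-- Customer.hbm.xml (sketch; class, table, and index names are hypothetical) -->
<hibernate-mapping xmlns="urn:nhibernate-mapping-2.2">
  <class name="Customer" table="Customer">
    <id name="Id" type="Int32">
      <generator class="native" />
    </id>
    <property name="LastName" type="String" length="100" />
  </class>

  <!-- Auxiliary DDL run by the schema tool after the tables are created
       (and the matching drop statement when the schema is torn down) -->
  <database-object>
    <create>CREATE INDEX IX_Customer_LastName ON Customer (LastName)</create>
    <drop>DROP INDEX IX_Customer_LastName</drop>
  </database-object>
</hibernate-mapping>
```

This keeps the tuning DDL versioned next to the mapping it belongs to, so a regenerated schema picks it up automatically.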
Upvotes: 0
Reputation: 1609
I've found it immensely useful for development, when you can use it with a bit of code to rebuild and repopulate test databases at will. Michael's point about migrations matches our experience - once you've made the initial release you'll need to decide on another method for altering production databases.
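The rebuild-at-will workflow described above can be sketched with NHibernate's `SchemaExport` class. A minimal sketch, assuming connection settings come from hibernate.cfg.xml and that `Customer` is a hypothetical mapped entity in your test assembly:

```csharp
// Sketch: drop and recreate a test database schema from the mappings.
using NHibernate.Cfg;
using NHibernate.Tool.hbm2ddl;

var cfg = new Configuration();
cfg.Configure();                             // reads hibernate.cfg.xml
cfg.AddAssembly(typeof(Customer).Assembly);  // Customer is a hypothetical mapped class

// First arg: echo the generated DDL to stdout; second: execute it against the DB.
new SchemaExport(cfg).Create(true, true);

// ...insert test data and run the tests, then tear the schema down:
new SchemaExport(cfg).Drop(false, true);
```

Calling this from a test fixture's setup gives each run a clean, consistent database without maintaining separate DDL scripts.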
FWIW, we've used NH schema generation with about 30 models of the usual kinds (including a table-per-subclass arrangement), and the definitions it generates are correct, so there's no obvious limit to the size of schema it can handle.
I now tend to think that an automatically generated schema is almost always a better starting point than a hand-crafted one, because the software will give you something that is totally consistent and exactly what you specified. The kinds of optimizations that a skilled DBA can do aren't likely to be necessary or useful until after you have a large, specific workload to tune for.
Upvotes: 2
Reputation: 22424
You are comparing apples and pears. A hand-crafted model will always (well, should) outperform any ORM technology.
I personally think NHibernate performs well and will map virtually any OO model to a relational model; that's the beauty of it. There are a few gotchas, like being aware of application start-up time and making sure you are using session management correctly.
I would recommend NHibernate; I have been using it for 18 months now on schemas of around 80 tables or so and have not yet seen any major issues.
Upvotes: 0