Reputation: 10571
We currently have the following task ahead, and I'm struggling to find what I believe should be out there somewhere.
A datamodel is stored in MySQL via Hibernate. Everything works fine and the product is shipped.
For a planned future version though changes to the datamodel are required. We do not yet know what changes and how drastically, but we do know that customer databases have to be updated accordingly.
This will most likely happen again for versions after that, which brings me to part one of the question:
I can see the following advantages of this approach:
The actual problem, however, assuming this approach is followed, is that we do not want to write additional code for generating the corresponding XML files. Ideally, I'd like to use a database that stores its contents in an XML file and provides a JDBC interface. Does anyone know how to achieve this with minimal effort?
The workflow I have in mind would be the following:
Feasible? Is there a totally better way to solve the problem? And where can I find a JDBC-supporting XML database?
Upvotes: 3
Views: 3156
Reputation: 17846
An easy solution could be to do a mysqldump --xml ( http://dev.mysql.com/doc/refman/5.1/en/mysqldump.html#option_mysqldump_xml ), which generates data like:
<mysqldump xmlns:xsi="http://www.w3.org/2001/XMLSchema-instance">
  <database name="world">
    <table_data name="City">
      <row>
        <field name="ID">1</field>
        <field name="Name">Kabul</field>
        <field name="CountryCode">AFG</field>
        <field name="District">Kabol</field>
        <field name="Population">1780000</field>
      </row>
      ...
    </table_data>
  </database>
</mysqldump>
You can then transform this data and load it again, table by table, using 'LOAD XML INFILE': http://dev.mysql.com/doc/refman/5.5/en/load-xml.html.
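The transform step in between can be done with plain XSLT. Here is a minimal sketch using the JDK's built-in XSLT engine; the class name and the "Name" → "CityName" rename are made-up examples of a schema change, so adapt the stylesheet to your actual model:

```java
import java.io.StringReader;
import java.io.StringWriter;
import javax.xml.transform.OutputKeys;
import javax.xml.transform.Transformer;
import javax.xml.transform.TransformerFactory;
import javax.xml.transform.stream.StreamResult;
import javax.xml.transform.stream.StreamSource;

// Sketch of the "transform" step between mysqldump --xml and LOAD XML INFILE,
// using the JDK's built-in XSLT support. The rename of "Name" to "CityName"
// is a hypothetical schema change for illustration only.
public class DumpTransform {

    private static final String XSLT =
        "<xsl:stylesheet version='1.0' xmlns:xsl='http://www.w3.org/1999/XSL/Transform'>"
      + "  <xsl:template match='@*|node()'>"                      // identity copy
      + "    <xsl:copy><xsl:apply-templates select='@*|node()'/></xsl:copy>"
      + "  </xsl:template>"
      + "  <xsl:template match=\"field[@name='Name']\">"          // rename one column
      + "    <field name='CityName'><xsl:apply-templates/></field>"
      + "  </xsl:template>"
      + "</xsl:stylesheet>";

    public static String transform(String dumpXml) throws Exception {
        Transformer t = TransformerFactory.newInstance()
                .newTransformer(new StreamSource(new StringReader(XSLT)));
        t.setOutputProperty(OutputKeys.OMIT_XML_DECLARATION, "yes");
        StringWriter out = new StringWriter();
        t.transform(new StreamSource(new StringReader(dumpXml)), new StreamResult(out));
        return out.toString();
    }

    public static void main(String[] args) throws Exception {
        System.out.println(transform(
            "<row><field name=\"Name\">Kabul</field></row>"));
        // <row><field name="CityName">Kabul</field></row>
    }
}
```

The same identity-template pattern extends to dropping columns, renaming tables, or restructuring rows before feeding the result back to LOAD XML INFILE.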
As an alternative, some colleagues of mine have had good results with Liquibase, a database refactoring tool. It lets you script your database migrations in a descriptive way using the refactoring commands listed at http://www.liquibase.org/manual/refactoring_commands.
For example:
<databaseChangeLog>
  <changeSet id="1" author="greyfairer">
    <addLookupTable
      existingTableName="address" existingColumnName="state"
      newTableName="state" newColumnName="abbreviation"
      constraintName="fk_address_state"
    />
  </changeSet>
</databaseChangeLog>
Liquibase also keeps track of every changeset that has ever been applied, in a dedicated changelog table. So if changesets 2a and 2b were applied in version 2 of your product, and 3a and 3b in version 3, Liquibase can detect that all four need to be applied when upgrading from version 1, but only 3a and 3b when upgrading from version 2, because 2a and 2b are already recorded in the changelog table.
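A cumulative changelog covering two versions could look roughly like this (table and column names are hypothetical); on each `liquibase update`, only the changesets not yet recorded in the changelog table get executed:

```xml
<databaseChangeLog>
  <!-- version 2 -->
  <changeSet id="2a" author="dev">
    <addColumn tableName="customer">
      <column name="email" type="varchar(255)"/>
    </addColumn>
  </changeSet>
  <changeSet id="2b" author="dev">
    <createIndex indexName="idx_customer_email" tableName="customer">
      <column name="email"/>
    </createIndex>
  </changeSet>
  <!-- version 3 -->
  <changeSet id="3a" author="dev">
    <renameColumn tableName="customer"
                  oldColumnName="email" newColumnName="email_address"
                  columnDataType="varchar(255)"/>
  </changeSet>
</databaseChangeLog>
```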
Granted, these solutions do not reflect your domain model, but for such migrations it's sometimes easier to work in database terms rather than in domain terms.
Upvotes: 1
Reputation: 9073
The better way to accomplish this, IMO, is not to use Hibernate.
We had a solution using nHibernate and SQL Server, so your setup will differ, but I'm sure equivalent tools could be applied to achieve the same goal.
To upgrade from R1 to R1.1, we'd generate the new schema into an empty 'build' database using the nHibernate mapping files (say, db1.1). We'd then take a copy of the current release (db1.0) and use a schema-comparison tool (we used DbGhost) to upgrade db1.0 to db1.1.
DbGhost (and other similar tools) will automatically cater for non-null column additions and new tables, and provide injection points for custom SQL for each set of wholesale changes.
It worked a dream and coped with our multi-developer, multi-environment setup just fine. A real bonus was that for the final release, the DbGhost call could be changed to produce a single upgrade script summarizing all the changes.
Hope it helps.
Upvotes: 0
Reputation: 116502
I do not think it would make much sense for XML databases (like eXist) to implement JDBC, since JDBC is built directly around SQL and the relational model. Further, I am not sure I see the planned benefit of XML here at all: XSLT is an acceptable transformation language for XML content, but what you would have in this case is at most an XML wrapping of relational data, not the deeply nested, tree-shaped textual data XSLT is designed for. So you might be better off just writing Java code that does the changes, row by row. This is a fairly common practice.
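A row-by-row migration in Java can keep the actual transformation as a pure function, with the JDBC plumbing around it. A minimal sketch, assuming a hypothetical v2 schema change that splits a single "name" column into "first_name"/"last_name" (column names and the split rule are made up for illustration):

```java
import java.util.LinkedHashMap;
import java.util.Map;

// Row-by-row migration sketch. In practice you would read rows from the old
// schema via JDBC (SELECT ... into a ResultSet) and write them to the new
// schema with a PreparedStatement; the transform itself is kept as a pure
// function so the migration logic is easy to test in isolation.
public class RowMigration {

    // Hypothetical rule: v2 splits the "name" column into first/last name.
    static Map<String, String> migrateRow(Map<String, String> oldRow) {
        Map<String, String> newRow = new LinkedHashMap<>(oldRow);
        String name = newRow.remove("name");
        if (name != null) {
            int sp = name.indexOf(' ');
            newRow.put("first_name", sp < 0 ? name : name.substring(0, sp));
            newRow.put("last_name",  sp < 0 ? ""   : name.substring(sp + 1));
        }
        return newRow;
    }

    public static void main(String[] args) {
        Map<String, String> old = new LinkedHashMap<>();
        old.put("id", "1");
        old.put("name", "Ada Lovelace");
        System.out.println(migrateRow(old));
        // {id=1, first_name=Ada, last_name=Lovelace}
    }
}
```

This keeps the per-row logic in plain Java where it can express whatever the schema change needs, without depending on an XML intermediate format at all.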
You can of course use XML or JSON as the intermediate storage/buffering format, i.e. for import and export. But the processing in between does not necessarily benefit from XSLT or a temporary database.
Upvotes: 1