Reputation: 335
I would like to convert my Django project's database from MySQL to PostgreSQL. Unfortunately, I can't just use Django's "dumpdata" and "loaddata" management commands because my database tables are too big. I have already read http://www.ofbrooklyn.com/2010/07/18/migrating-django-mysql-postgresql-easy-way/, but to use that method in practice it seems I would still have to turn it into something like a management command that iterates over the available models, copies the model instances and resets the sequences. It also doesn't look very fast, since it calls save() on every model instance.
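Roughly, I imagine such a command would look like the sketch below (untested; the "mysql" and "postgres" database aliases are placeholders for whatever is configured in settings.DATABASES, and foreign-key ordering and many-to-many through tables are ignored):

    # myapp/management/commands/copy_mysql_to_postgres.py -- rough sketch only
    from django.apps import apps
    from django.core.management.base import BaseCommand


    class Command(BaseCommand):
        help = "Copy every row of every model from the 'mysql' alias to 'postgres'."

        def handle(self, *args, **options):
            for model in apps.get_models():
                self.stdout.write("Copying %s ..." % model.__name__)
                for obj in model.objects.using("mysql").iterator():
                    # One INSERT per row -- this is exactly why the approach is slow.
                    obj.save(using="postgres")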
Is there a better way to migrate? Actually, I would prefer to run "mysqldump", somehow convert the dump from MySQL to PostgreSQL format, and then load it into PostgreSQL. What software would you recommend that can perform such a dump conversion and correctly map MySQL data types to PostgreSQL ones, for example tinyint(1) to boolean?
Edit: Thanks, everyone, for your help. I successfully migrated my database with the https://github.com/maxlapshin/mysql2postgres utility. However, I still had to reset the sequences in the resulting PostgreSQL database myself after importing the dump.
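In case it helps anyone: Django's built-in sqlsequencereset command generates the SQL for that, so piping its output into dbshell should do the trick (app1 and app2 are placeholder app labels):

    python manage.py sqlsequencereset app1 app2 | python manage.py dbshell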
Upvotes: 4
Views: 2212
Reputation: 174624
If your dataset is large and you also need to transform the data along the way, you can use open-source ETL tools such as Talend Studio or Pentaho's Kettle project to create a mapping and let the tool take care of the rest. This may be overkill, though, unless you actually need such transformations (or your dataset is really, really big).
EnterpriseDB (one of the companies behind PostgreSQL) provides a data migration studio product that does this, along with a guide if you want to do it yourself.
The py-mysql2pgsql Python package is another option.
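If I remember correctly, it is driven by a small YAML file describing the source and destination connections, roughly like this (hostnames and credentials are placeholders; check the project's README for the exact keys):

    # mysql2pgsql.yml
    mysql:
      hostname: localhost
      port: 3306
      username: mysql_user
      password: secret
      database: my_django_db

    destination:
      postgres:
        hostname: localhost
        port: 5432
        username: postgres_user
        password: secret
        database: my_django_db

and is then run with something like:

    pip install py-mysql2pgsql
    py-mysql2pgsql -v -f mysql2pgsql.yml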
Upvotes: 0
Reputation: 6499
There are several converters for this, for example this one written in Ruby: https://github.com/maxlapshin/mysql2postgres
Upvotes: 3