prepare your postgresql database with the schema (not the data)
make a script (perl, python, php, whatever) which sequentially selects everything from a mysql table and inserts it into the corresponding postgresql table.
(you might want to make it select/insert 10k rows at a time to prevent memory issues; a rough python sketch is below).
There are commercial products which can also do this, but they assume they have to migrate the schema as well. Better steer clear of those.
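Something along these lines might work as a minimal python sketch. The connection details, the table name users and its columns are placeholders, and you would repeat the loop for each table:

import mysql.connector
import psycopg2

BATCH = 10000  # select/insert 10k rows at a time to keep memory usage flat

# placeholder connection settings, adjust for your setup
my = mysql.connector.connect(host="localhost", user="root",
                             password="secret", database="mydb")
pg = psycopg2.connect("dbname=mydb_pg user=postgres")

mcur = my.cursor()
pcur = pg.cursor()

mcur.execute("SELECT id, name, email FROM users")
while True:
    rows = mcur.fetchmany(BATCH)   # pull the next batch from mysql
    if not rows:
        break
    pcur.executemany(
        "INSERT INTO users (id, name, email) VALUES (%s, %s, %s)",
        rows)                      # push the batch into postgresql
    pg.commit()

mcur.close(); pcur.close()
my.close(); pg.close()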
Or just use mysqldump --compact -c -e -n -t --compatible=postgresql
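For example, something like this (mydb and mydb_pg are placeholder database names, and the dump usually still needs some hand-editing before postgresql will accept it):

mysqldump --compact -c -e -n -t --compatible=postgresql mydb > dump.sql
psql mydb_pg < dump.sql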
I recently migrated version 2.2.1 from MySQL to PostgreSQL. I've described the process on my blog, so instead of writing the explanation again here, I'll just post a link:
This tool is feature-rich and easy to use. It maps data types and migrates constraints, indexes, primary keys and foreign keys exactly as they were in your MySQL db. Under the hood it uses PostgreSQL COPY, so data transfer is very fast.
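To give an idea of why COPY is so much faster than row-by-row INSERTs, here is a small sketch (not the tool's actual code) that streams a CSV file into a table with psycopg2; the file name, table and columns are placeholders:

import psycopg2

pg = psycopg2.connect("dbname=mydb_pg user=postgres")
cur = pg.cursor()

with open("users.csv") as f:
    # COPY ingests the whole stream in a single command,
    # avoiding per-row INSERT overhead
    cur.copy_expert("COPY users (id, name, email) FROM STDIN WITH CSV", f)

pg.commit()
cur.close()
pg.close()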