I've done this with a massive database. It wasn't pretty, but it is
possible.

Here are my very brief notes on the conversion. I used phpMyAdmin to
export the data, and either the PostgreSQL module in Webmin or
phpPgAdmin to import it. Webmin became necessary for the large tables,
as phpPgAdmin would fail or time out on them.

------------------------
Importing from MySQL -> PostgreSQL

1- Export the data in CSV format from MySQL (a command-line
   equivalent is sketched after this list).
    - set "fields terminated by:" to ','
    - check the "put field names at first row" box
    - if the field names don't already match the import table in
        postgres when displayed, change them so they do

2- Import into PostgreSQL (again, see the sketch after the list).
    - browse to the table that needs to be populated
    - click 'import' in the top menu
    - for format, choose "CSV"
    - for "allowed null characters", choose "NULL (the word)"
    - browse for the file and import it
---------------------
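
If you'd rather skip the phpMyAdmin UI for step 1, MySQL can write the
CSV itself. A minimal sketch, assuming a table named auth_user with
those three columns (the names are illustrative) and a MySQL user with
the FILE privilege:

    -- MySQL: dump a table to CSV on the server's filesystem.
    -- Note: INTO OUTFILE writes no header row, so either add one by
    -- hand or drop HEADER from the PostgreSQL COPY in the next sketch.
    SELECT id, username, email
    INTO OUTFILE '/tmp/auth_user.csv'
    FIELDS TERMINATED BY ','
    OPTIONALLY ENCLOSED BY '"'
    LINES TERMINATED BY '\n'
    FROM auth_user;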
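
The phpPgAdmin/Webmin import in step 2 boils down to a COPY on the
PostgreSQL side. Same made-up names as above; note that COPY FROM a
file reads from the server's filesystem and needs superuser rights
(psql's \copy is the client-side equivalent):

    -- PostgreSQL: load the CSV, treating the literal word NULL as NULL.
    COPY auth_user (id, username, email)
    FROM '/tmp/auth_user.csv'
    WITH NULL AS 'NULL' CSV HEADER;

    -- After importing explicit primary keys, bump the sequence so new
    -- inserts don't collide with the imported ids.
    SELECT setval('auth_user_id_seq', (SELECT MAX(id) FROM auth_user));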

In many cases, I had to build a custom table in MySQL that matched the
django-generated table in PostgreSQL before exporting. There was a lot
of custom SQL involved in some of the hairier table relationships, as
the conversion was not a 1:1 match. I also made some modifications and
optimizations along the way for the new django system, so I had to
work those into the conversion as well.
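
To give a flavor of the custom SQL: for each django-generated table I
couldn't export directly, I'd build a staging table in MySQL whose
columns line up one-for-one with the target, then export that with the
steps above. A sketch with made-up table and column names:

    -- MySQL: reshape and rename the legacy columns to match the
    -- layout of the django-generated table before exporting.
    CREATE TABLE export_blog_entry AS
    SELECT entry_id    AS id,
           entry_title AS title,
           posted_on   AS pub_date,
           author_fk   AS author_id
    FROM   old_entries;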

Hope that helps.

