On 14/06/10 04:15, Jeff Green wrote:
I was wondering what would be the best method of trying to dump a
large database. I am trying to migrate from postgresql to oracle, using
django 1.2.1, and I have been unsuccessful in using dumpdata.
One interesting point: django 1.2 supports multiple database connections.
I believe (though I haven't tried it) that it should be able to
connect to both postgresql and oracle at the same time. So with a little
python glue, you should be able to live-copy from one db to the other,
assuming you have the resources to run both dbs at once (if you
can afford oracle, I guess you do...).
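Roughly, the idea is to declare both connections in settings and then loop
over the models, reading rows from one alias and saving them into the other.
Something like this untested sketch (the 'legacy' alias, credentials, and the
foreign-key ordering are all things you'd have to adapt to your project):

    # settings.py -- Django 1.2 style, both databases declared up front
    DATABASES = {
        'default': {  # the Oracle target
            'ENGINE': 'django.db.backends.oracle',
            'NAME': 'xe',
            'USER': 'myapp',
            'PASSWORD': 'secret',
        },
        'legacy': {   # the existing PostgreSQL source
            'ENGINE': 'django.db.backends.postgresql_psycopg2',
            'NAME': 'myapp',
            'USER': 'myapp',
            'PASSWORD': 'secret',
        },
    }

    # copy_data.py -- run from ./manage.py shell or a management command
    from django.db.models import get_models

    for model in get_models():
        # You may need to order models so FK targets get copied before the
        # rows that point at them, or temporarily relax the constraints.
        for obj in model.objects.using('legacy').all().iterator():
            obj.save(using='default')  # pks are preserved, so FKs line up

Slow, but it goes through the ORM, so the datatype differences between the
two backends are handled for you.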
I used a nasty hack job at one stage with django 1.1 to suck data from
mysql (ewww) into postgresql (yay); I imagine it will be easier now
that multiple connections are officially supported. We first tried
various perl scripts that munged the sql-level textual dumps, but the
live copy worked better (though slower) thanks to the cross-database
datatype abstraction work in django (and the goal was to use the data
with django anyway).
http://docs.djangoproject.com/en/1.2/topics/db/multi-db/