I guess I should have prefaced that by saying my goal is to migrate
from MySQL to PostgreSQL. However, I'm having trouble finding a tool
to do this, so I thought I'd try Django's backend-neutral
dumpdata/loaddata feature.
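
For what it's worth, the memory blow-up with dumpdata seems to come from
building the entire serialized dataset in memory before writing it out. A
rough, Django-free sketch of the streaming idea (the `rows` generator here
just stands in for a queryset iterator; these names are mine, not Django's):

```python
import io
import json

def rows():
    # Stand-in for something like Model.objects.iterator(): yields one
    # record at a time instead of loading the whole table into memory.
    for pk in range(3):
        yield {"model": "app.thing", "pk": pk, "fields": {"name": "row %d" % pk}}

def dump_stream(records, fh):
    # Write a JSON array incrementally, one object per record, so peak
    # memory stays at roughly one record rather than the full table.
    fh.write("[")
    for i, rec in enumerate(records):
        if i:
            fh.write(",")
        fh.write(json.dumps(rec))
    fh.write("]")

buf = io.StringIO()
dump_stream(rows(), buf)
```

Dumping one app (or even one model) at a time with this kind of incremental
writing keeps the working set small, at the cost of not getting a single
fixture file.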

Chris

On Mon, Aug 10, 2009 at 9:48 PM, Malcolm Tredinnick <malc...@pointy-stick.com> wrote:
>
> On Mon, 2009-08-10 at 17:02 -0700, Chris wrote:
>> I'm trying to dump a 3GB MySQL database using manage.py dumpdata, but
>> it's getting killed after 2 hours. Is there any way to get it to use
>> less memory/CPU so it doesn't get killed and completes the dump?
>
> Is there some particular reason you need to use dumpdata for this? At
> some point, using the database's native tools is going to be a lot more
> efficient and robust. Dumpdata is great for the sweet spot, but it isn't
> designed to completely replace all existing database tools.
>
> Regards,
> Malcolm
>

--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"Django users" group.
To post to this group, send email to django-users@googlegroups.com
To unsubscribe from this group, send email to 
django-users+unsubscr...@googlegroups.com
For more options, visit this group at 
http://groups.google.com/group/django-users?hl=en
-~----------~----~----~----~------~----~------~--~---