On Tue, Aug 11, 2009 at 5:37 AM, Chris Spencer<chriss...@gmail.com> wrote:
>
> I guess I should have prefaced that by saying my goal is to migrate
> from MySQL to PostgreSQL. However, I'm having trouble finding a tool
> to do this, so I thought I'd try Django's backend-neutral
> dumpdata/loaddata feature.
>
> Chris
>
> On Mon, Aug 10, 2009 at 9:48 PM, Malcolm
> Tredinnick<malc...@pointy-stick.com> wrote:
>>
>> On Mon, 2009-08-10 at 17:02 -0700, Chris wrote:
>>> I'm trying to dump a 3GB MySQL database using manage.py dumpdata, but
>>> it's getting killed after 2 hours. Is there any way to get it to use
>>> less memory/cpu so it doesn't get killed and completes the dump?
>>
>> Is there some particular reason you need to use dumpdata for this? At
>> some point, using the database's native tools is going to be a lot more
>> efficient and robust. Dumpdata is great for the sweet spot, but it isn't
>> designed to completely replace all existing database tools.
>>
>> Regards,
>> Malcolm
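A minimal sketch of the native-tools route Malcolm suggests, assuming the
MySQL client tools are installed and a hypothetical database name "mydb"
(substitute your own). Note that mysqldump's postgresql compatibility mode
rarely produces SQL that loads cleanly as-is; expect to hand-edit the dump
or run it through a conversion script first:

```shell
# Dump schema and data from MySQL in a (roughly) PostgreSQL-friendly form.
mysqldump --compatible=postgresql --default-character-set=utf8 mydb > mydb.sql

# After fixing up incompatibilities (quoting, types, AUTO_INCREMENT, etc.),
# load the result into an existing PostgreSQL database:
psql -d mydb -f mydb.sql
```

This avoids pulling the whole 3GB through Python, which is where dumpdata runs out of memory.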

Unless the majority of your data is in one table, I'd try dumping each
application separately into a few fixture files and loading them
individually.
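For example, assuming hypothetical app labels "auth" and "blog" (substitute
the apps in your project), something like:

```shell
# Dump each app to its own fixture instead of one giant dump.
python manage.py dumpdata auth > auth.json
python manage.py dumpdata blog > blog.json

# After pointing the DATABASE_* settings at PostgreSQL and creating the tables:
python manage.py syncdb --noinput
python manage.py loaddata auth.json blog.json
```

Each dumpdata run then only has to hold one app's data in memory at a time.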

Alex

-- 
"I disapprove of what you say, but I will defend to the death your
right to say it." -- Voltaire
"The people's good is the highest law." -- Cicero
"Code can always be simpler than you think, but never as simple as you
want" -- Me

--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"Django users" group.
To post to this group, send email to django-users@googlegroups.com
To unsubscribe from this group, send email to 
django-users+unsubscr...@googlegroups.com
For more options, visit this group at 
http://groups.google.com/group/django-users?hl=en
-~----------~----~----~----~------~----~------~--~---