Thanks Doug,
I'm going to try producing a SQL COPY-able file and adding the
indexes/foreign keys later.
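For the record, here's roughly what I'm planning -- an untested sketch,
with made-up table/column names and file path:

    # write the rows as tab-delimited text, the default COPY format
    out = open('/tmp/mytable.tsv', 'w')
    for row in rows:    # rows = the parsed import data
        out.write('\t'.join([row['col_a'], row['col_b']]) + '\n')
    out.close()

    # then load it in one shot, e.g. from psql as a superuser:
    #   COPY myapp_mytable (col_a, col_b) FROM '/tmp/mytable.tsv';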

About debugging: I run the import script from the command line, but I
do have DEBUG on in my settings file. Does that affect the PostgreSQL
debug log?
Rahul

On May 7, 5:35 pm, Doug B <[EMAIL PROTECTED]> wrote:
> I think you may need to consider looking outside the ORM for imports
> of that size.  I was having trouble with only about 5 million rows,
> and got a reasonable speed-up using executemany.  It was still slow,
> though.  I ended up redesigning to eliminate that table, but from what
> I read while trying to make it work, a good approach would have been:
>
> drop the indexes
> import the data, convert it to raw SQL statements, and save those
> to a file
> do a bulk load of the file
> add the indexes back
>
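If I'm following, the whole cycle would look something like this --
untested, with made-up table/column/index names, and COPY FROM a
server-side file needs superuser rights:

    from django.db import connection, transaction

    cur = connection.cursor()
    # 1. drop the indexes so the bulk load doesn't maintain them row by row
    cur.execute("DROP INDEX myapp_mytable_col_a;")
    # 2./3. bulk load the file written earlier, in one shot
    cur.execute("COPY myapp_mytable (col_a, col_b) FROM '/tmp/mytable.tsv';")
    # 4. rebuild the index once, over the full table
    cur.execute("CREATE INDEX myapp_mytable_col_a ON myapp_mytable (col_a);")
    transaction.commit_unless_managed()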
> If you have debug enabled you can take a look at
> django.db.connection.queries after a small import using your models to
> get an idea of how to build the SQL statements.  On a related note, I
> always seem to get bitten by forgetting to turn debug off when doing
> large imports.  All of the SQL statements are kept in memory in debug
> mode, and things can get ugly quickly for large datasets.
>
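That connection.queries tip is handy; for anyone else reading, I think
the usage is something like this (model and field names made up):

    from django.db import connection, reset_queries
    from myapp.models import MyModel

    MyModel(col_a='x', col_b='y').save()    # one small ORM insert
    print connection.queries[-1]['sql']     # the raw INSERT Django just ran
    reset_queries()                         # clear the in-memory query log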
> There might be better ways to do it (I'm no DB expert), but since you
> hadn't gotten a reply in a few days, maybe something is better than
> nothing.