Agreed, I'm not sure my suggestion will help you. You could look at
turning off Django's automatic transaction handling; that might make
your code faster, but I'm really not sure.

(Check out manual transaction handling:

    from django.db import transaction

    @transaction.commit_manually
)
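A minimal sketch of how that decorator can be used to commit in batches
instead of once per row. The `chunked` helper is generic; the Django part
is shown in comments because it needs a configured project, and `MyModel`
and `rows` are hypothetical names, not anything from this thread.

```python
# Sketch: batch saves so Django commits once per chunk, not once per row.
from itertools import islice

def chunked(iterable, size):
    """Yield lists of up to `size` items from `iterable`."""
    it = iter(iterable)
    while True:
        batch = list(islice(it, size))
        if not batch:
            break
        yield batch

# Hypothetical Django 1.x usage (requires a configured project):
#
# from django.db import transaction
#
# @transaction.commit_manually
# def bulk_save(rows):
#     for batch in chunked(rows, 500):
#         for row in batch:
#             MyModel(**row).save()
#         transaction.commit()   # one COMMIT per 500 rows, not per row
```

The batch size of 500 is arbitrary; tune it against your backend.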

- Benjamin

On Thu, Aug 20, 2009 at 9:58 PM, Doug Blank<doug.bl...@gmail.com> wrote:
> On Thu, Aug 20, 2009 at 10:40 PM, Benjamin Sergeant <bserg...@gmail.com>
> wrote:
>>
>> (the answer was already sent, raw SQL, anyway)
>>
>> (With PostgreSQL) I would try loading just a small amount of data with
>> the slow create-a-Django-object-and-save process, then do a pg_dump
>> and look at the SQL that pg_dump generated. Then write a Python script
>> that generates that SQL.
>>
>> Then do a pg load, which is super fast.
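A small sketch of the "write a Python script that generates that" step,
assuming you target PostgreSQL's COPY text format (tab-separated, `\N`
for NULL). The function and the suggested `psql` command line use made-up
table and file names for illustration.

```python
# Sketch: render rows as the tab-separated text that COPY reads.
def copy_lines(rows):
    """Render rows (sequences of values) as COPY-format text lines."""
    out = []
    for row in rows:
        fields = []
        for value in row:
            if value is None:
                fields.append(r"\N")          # COPY's NULL marker
            else:
                # Escape backslash, tab and newline per the COPY text format
                s = str(value).replace("\\", "\\\\")
                s = s.replace("\t", "\\t").replace("\n", "\\n")
                fields.append(s)
        out.append("\t".join(fields))
    return "\n".join(out) + "\n"

# Then load it with something like:
#   psql -c "\copy mytable FROM 'data.tsv'"
```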
>
> Let me be more specific about what I am looking for: I am writing an
> application that could have any Django-supported backend. I have a
> two-pass importer that fills in tables (the program does a lot of
> processing and data manipulation in these import passes). The
> application is written in Django. I am looking for a way to speed up
> the first pass, which doesn't need indexes (of course, I'd speed up
> all of it if I could).
>
> I do not want to rewrite what I have as raw SQL; I am just looking for
> anything I can do, such as turning off indexing, writing to the db in
> batches, or similar Django-level optimizations.
>
> I don't think what you mentioned can help in this situation, right?
>
> -Doug
>
>
>>
>> - Benjamin
>>
>>
>>
>>
>> On Thu, Aug 20, 2009 at 7:08 PM, Doug Blank<doug.bl...@gmail.com> wrote:
>> >
>> > On Aug 20, 2:50 pm, Alex Gaynor <alex.gay...@gmail.com> wrote:
>> >> On Thu, Aug 20, 2009 at 1:46 PM, Abiel<abi...@gmail.com> wrote:
>> >>
>> >> > Is there an efficient way to use Django models to load a large
>> >> > number of records into a database without falling back on raw SQL?
>> >> > Creating a large number of model objects and then saving each one
>> >> > individually is very slow (I imagine Django is running INSERT and
>> >> > COMMIT each time?).
>> >>
>> >> > Thanks very much.
>> >>
>> >> Django doesn't currently support any form of bulk insert.  Using raw
>> >> SQL is your best option at this point.
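For the raw-SQL route, one common pattern is a DB-API `executemany`,
which sends all the rows in one call. The sketch below uses the stdlib
`sqlite3` module as a stand-in so it is self-contained; in a Django
project you would get an equivalent DB-API cursor from
`django.db.connection.cursor()`. The table and rows are invented.

```python
# Sketch: bulk insert via DB-API executemany (sqlite3 as a stand-in).
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE item (name TEXT, qty INTEGER)")

rows = [("apple", 3), ("pear", 5), ("plum", 7)]
conn.executemany("INSERT INTO item (name, qty) VALUES (?, ?)", rows)
conn.commit()

count = conn.execute("SELECT COUNT(*) FROM item").fetchone()[0]
print(count)  # 3
```

Note that the parameter placeholder style varies by backend (`?` for
sqlite3, `%s` for the PostgreSQL and MySQL adapters).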
>> >>
>> >
>> > I am also trying to do a bulk load, but it has to be written in
>> > Python. It is a two-pass loading procedure, where the first pass
>> > could be done with the indexes turned off, and the second pass with
>> > them on.
>> >
>> > Is there a way to disable indexes momentarily and then turn them back
>> > on? Or can I manage the transactions to do saves in bulk in the first
>> > pass?
>> >
>> > Thanks for any pointers!
>> >
>> > -Doug
>> >
>> >> Alex
>> >>
>> >> --
>> >> "I disapprove of what you say, but I will defend to the death your
>> >> right to say it." -- Voltaire
>> >> "The people's good is the highest law." -- Cicero
>> >> "Code can always be simpler than you think, but never as simple as you
>> >> want" -- Me
>> >
>>
>>
>
>
>

--~--~---------~--~----~------------~-------~--~----~
You received this message because you are subscribed to the Google Groups 
"Django users" group.
To post to this group, send email to django-users@googlegroups.com
To unsubscribe from this group, send email to 
django-users+unsubscr...@googlegroups.com
For more options, visit this group at 
http://groups.google.com/group/django-users?hl=en
-~----------~----~----~----~------~----~------~--~---
