Hello,

I have an application which looks like this:

Star: id, etc.
Band: id, etc.
LightCurve: id, star fkey(Star), band fkey(Band)

Postgres 8.2.3 is the backend.

Every Star basically has multiple light curves (brightness vs time
plots) in different spectral bands.

Now, there are about half a billion stars I want to add. Having the
foreign keys really slows things down, even when I batch commits with
the manual-transaction decorator every 3000 stars or so.
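Roughly, the batched loading looks like this. This is a generic sketch:
`insert` and `commit` are placeholders standing in for
`Star.objects.create` and `django.db.transaction.commit()` under the
manual-transaction decorator, so the batching logic itself can be seen
in isolation.

```python
def load_in_batches(rows, insert, commit, batch_size=3000):
    """Insert rows one at a time, committing every batch_size rows.

    insert and commit are placeholders; in the real loader they would be
    Star.objects.create and transaction.commit() inside a function wrapped
    with Django's commit_manually decorator.
    """
    count = 0
    for row in rows:
        insert(row)
        count += 1
        if count % batch_size == 0:
            commit()
    commit()  # flush the final, possibly partial, batch
```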

The timing goes thus (rows loaded, time taken, wall-clock time at end
of batch):

first 15623   6 mins   21:31
next  21681  14 mins   21:45
next  29262  31 mins   22:16
next  36158  57 mins   23:13

which seems to indicate that as more stars get created, the fkey
lookups take longer and longer.

Also, Django generates additional indexes such as
"maindb_lightcurve_star_id" btree (star_id), which I presume is done to
allow for extremely quick 'backwards' queries such as
starinstance.lcot_set.all().

My questions are:
(a) Do these additional indexes affect performance on the inserts? It
would seem so.
(b) Can I add the fkeys later, by using explicit integer fields for the
ids during the load and then changing the model? Similarly, I could add
the indexes later too. But syncdb won't pick this change up, right?
However, with the changed model, will everything just work?
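To make (b) concrete, here is a hypothetical sketch of the statements
I'd run against Postgres after the bulk load, to add the constraint and
index that Django would otherwise create at syncdb time. The table and
column names (maindb_lightcurve, maindb_star, star_id) are assumptions
based on Django's default naming for the models above.

```python
def fkey_sql(table, column, ref_table):
    """Build an ALTER TABLE statement adding a foreign key after loading.

    Table/column names are assumed to follow Django's defaults; verify
    against \d output in psql before running.
    """
    return ("ALTER TABLE %s ADD CONSTRAINT %s_%s_fkey "
            "FOREIGN KEY (%s) REFERENCES %s (id);"
            % (table, table, column, column, ref_table))

def index_sql(table, column):
    """Build the CREATE INDEX statement for the reverse-lookup index."""
    return "CREATE INDEX %s_%s ON %s (%s);" % (table, column, table, column)

# e.g. fkey_sql("maindb_lightcurve", "star_id", "maindb_star")
#      index_sql("maindb_lightcurve", "star_id")
```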

Any other ideas for large scale insert efficiency?



You received this message because you are subscribed to the Google Groups
"Django users" group.