Hey Vidja,

May I ask why so many database instances are required? In my experience, if
you get to the point where you really need that many, it's usually better to
use a solution like the Sphinx search engine, and then use your database as a
heavy-duty key/value store (with only the bare-minimum unique compound keys).

We have a database with around 19 million rows (500,000 more added each day),
which comes to around 20 GB in the data directory. We currently push around
170 queries per second on average (consisting of simple ID lookups) on a
single quad-core server with 8 GB of RAM. All the lookup queries go through
Sphinx.
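
To make that pattern concrete, here is a minimal sketch assuming the classic
sphinxapi Python client and a hypothetical Django model called Document; the
index name, host, and port are assumptions for illustration. Sphinx resolves
the actual search, and the database is only hit with cheap primary-key
lookups:

    # Minimal sketch: Sphinx answers the search, the database only serves
    # primary-key lookups. Model, index name, and connection details are
    # hypothetical.
    import sphinxapi

    from myapp.models import Document  # hypothetical model


    def search_documents(term):
        client = sphinxapi.SphinxClient()
        client.SetServer('localhost', 9312)           # searchd host/port (assumed)
        result = client.Query(term, 'documents_idx')  # index name is an assumption
        if not result:
            return []
        ids = [match['id'] for match in result['matches']]
        # Fetch only the matched rows from the database, by primary key.
        return list(Document.objects.filter(pk__in=ids))

The point of the split is that the database never has to plan anything more
complicated than a primary-key lookup; all the heavy matching stays in Sphinx.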

This may not be appropriate for what you need, and it does add some
complexity, but I would certainly encourage you to explore this option if
performance is what you need, without having to keep throwing more and more
hardware at it.

Cal



On Mon, Mar 7, 2011 at 9:41 PM, Vidja <vidja.hun...@gmail.com> wrote:

> Hi all,
>
> Does anyone have experience with querying against multiple databases
> and the performance issues involved? My plan is to query against 15-200
> instances of a database (Postgres on an 8-core, 32 GB machine, each DB
> about 200 MB), all with the same schema (and thus the same model file),
> but with different data.
> Are there serious performance bottlenecks to be expected?
>
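
For reference, the setup Vidja describes maps onto Django's multiple-database
support. A minimal sketch, with hypothetical database aliases and model names
(and the settings trimmed to the essentials), might look like this:

    # settings.py -- one entry per Postgres instance; aliases are hypothetical.
    DATABASES = {
        'default': {
            'ENGINE': 'django.db.backends.postgresql_psycopg2',
            'NAME': 'tenant_1',
            'USER': 'app',
            'PASSWORD': 'secret',
            'HOST': 'localhost',
        },
        'tenant_2': {
            'ENGINE': 'django.db.backends.postgresql_psycopg2',
            'NAME': 'tenant_2',
            'USER': 'app',
            'PASSWORD': 'secret',
            'HOST': 'localhost',
        },
        # ... one alias per additional instance
    }

    # Application code: the same shared-schema model, routed to a chosen alias.
    from myapp.models import Record  # hypothetical model

    def records_for(alias):
        return Record.objects.using(alias).all()

Each query then runs against a single instance; fanning the same query out
across all 15-200 instances would have to be done in application code.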

