I guess I would still recommend that you not get too fancy. Your data
rate is 50 plants at 8,000 records each, for a total of 400k records per
day. I have a MySQL table that added 500k records in the last 24 hours,
and it's fine. It's just an ordinary InnoDB table, with no special
optimization.
One simple solution for my problem is table partitioning. Both MySQL
and PostgreSQL can handle that.
On PostgreSQL I can create a master table for my model and one partition
for every value of a master key. It should be great.
Do you think it's possible to handle a child table creation (also with
a PGS
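For reference, here is a minimal sketch of what one partition looks like in inheritance-based PostgreSQL partitioning (the pre-declarative style: a CHECK constraint plus INHERITS). The table and column names (`plantdata`, `plant_id`) are hypothetical, and the DDL is only built as a string here, not executed against a database:

```python
def partition_ddl(master="plantdata", key_column="plant_id", key_value=1):
    """Build the DDL for one child partition of an inheritance-partitioned
    master table: the child inherits the master's columns, and the CHECK
    constraint pins it to a single key value so the planner can prune it."""
    child = f"{master}_{key_value}"
    return (
        f"CREATE TABLE {child} ("
        f" CHECK ({key_column} = {key_value})"
        f") INHERITS ({master});"
    )

# One child table per plant; a trigger function on the master would then
# route each INSERT to the matching child (not shown here).
print(partition_ddl("plantdata", "plant_id", 7))
```

In this scheme, automating child-table creation means running DDL like the above (plus rebuilding the routing trigger) whenever a new key value appears, which is exactly the part that tends to get fiddly.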
On Fri, Jan 23, 2009 at 7:17 AM, Alessandro Ronchi <
alessandro.ron...@soasi.com> wrote:
> 2009/1/23 Ned Batchelder
>
>> You don't need to create new tables like this. These database systems
>> are very good at handling large amounts of data. Add a field plant to your
>> model, make sure it is indexed, and use it to query for the data you want.
>> The entire system from the database up through the ORM and the rest of
>> Django