On 03/09/07, Scott Marlowe <[EMAIL PROTECTED]> wrote:
>
> On 9/3/07, Rob Kirkbride <[EMAIL PROTECTED]> wrote:
> > Hi,
> >
> > I've got a postgres database collecting logged data. This data I have
> > to keep for at least 3 years. The data in the first instance is being
> > recorded in a postgres cluster. This then needs to be moved to a
> > reports database server for analysis. Therefore I'd like a job to
> > dump data from the cluster, say, every hour and record it in the
> > reports database. The clustered database could then be purged of data
> > more than, say, a week old.
> >
> > So basically I need a dump/restore that only appends new data to the
> > reports server database.
> >
> > I've googled but can't find anything. Can anyone help?
>
> You might find an answer in partitioning your data.  There's a section
> in the docs on it.  If you're partitioning by week, you can just dump
> the data from the newest couple of partitions, and purge anything
> older with a simple delete where date < now() - interval '1 week', or
> something like that.
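
For reference, a minimal sketch of the weekly scheme described above,
using 8.x-style inheritance partitioning; the table logdata, the column
logged_time, and the week boundaries are all assumed names and values:

    -- Weekly child table; "logdata" and "logged_time" are assumptions.
    CREATE TABLE logdata_2007w36 (
        CHECK (logged_time >= DATE '2007-09-03'
           AND logged_time <  DATE '2007-09-10')
    ) INHERITS (logdata);

    -- A finished week can then be moved wholesale:
    --   pg_dump -t logdata_2007w36 clusterdb | psql reportsdb
    -- With real partitions the purge is just dropping the old child;
    -- without them it is the delete mentioned above:
    DELETE FROM logdata WHERE logged_time < now() - interval '1 week';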



We're using Hibernate to write to the database, so partitioning looks like
it would be too much of a re-architecture. In reply to Andrej: we do have a
logged_time column in the required tables. That being the case, how does
that help me with the tools provided?

Might I have to write a custom JDBC application to do the data migration?
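
If the tables have a usable timestamp, one alternative to a custom JDBC
program is pulling new rows from the reports side with the dblink contrib
module. A minimal sketch, assuming a table logdata with columns id,
logged_time and payload (all names and connection details here are
guesses):

    -- Run hourly (e.g. from cron) on the reports server.
    -- Requires the dblink contrib module; all names are assumptions.
    INSERT INTO logdata
    SELECT *
    FROM dblink('host=clusterhost dbname=logdb user=reports',
                'SELECT id, logged_time, payload FROM logdata
                  WHERE logged_time > now() - interval ''1 hour''')
         AS t(id integer, logged_time timestamptz, payload text);

In practice the job would track the highest logged_time already copied
rather than trusting now(), so clock skew or a missed run can't drop rows.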

Rob
