Re: [GENERAL] Best way to handle multi-billion row read-only table?

2010-02-09 Thread Asher Hoskins
Justin Graf wrote: Well, first: is that 200 Hz meaning 200 samples per channel per second? That is very fast sampling for a pressure sensor; I would be surprised if the meters are actually giving real results at that rate. I would look at reducing that down to what the meter is actually capable
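
As a rough illustration of the downsampling being suggested, here is a minimal sketch (the names raw_samples, channel_id, sample_time and pressure are invented for the example, not taken from the thread) that averages 200 Hz readings down to one row per channel per second:

    -- Average 200 Hz raw readings into one row per channel per second.
    SELECT channel_id,
           date_trunc('second', sample_time) AS sample_second,
           avg(pressure)                     AS avg_pressure
    FROM   raw_samples
    GROUP  BY channel_id, date_trunc('second', sample_time)
    ORDER  BY channel_id, sample_second;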

[GENERAL] to_timestamp() and quarters

2010-03-02 Thread Asher Hoskins
Hello. I can't seem to get to_timestamp() or to_date() to work with quarters; can anyone see what I'm doing wrong? e.g. select to_date('2010-1', 'YYYY-Q'); Gives "2010-01-01" (correct). select to_date('2010-3', 'YYYY-Q'); Also gives "2010-01-01" (should be 2010
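
For context, the asymmetry is easy to reproduce: to_char() resolves the quarter on output, but the Q field is effectively ignored when parsing back in (a small sketch, assuming a PostgreSQL of roughly this era or later):

    -- On output, to_char() reports the quarter:
    SELECT to_char(date '2010-08-15', 'YYYY-Q');   -- '2010-3'
    -- On input, the Q field is ignored, so both of these yield 2010-01-01:
    SELECT to_date('2010-1', 'YYYY-Q');
    SELECT to_date('2010-3', 'YYYY-Q');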

Re: [GENERAL] to_timestamp() and quarters

2010-03-04 Thread Asher Hoskins
A. Kretschmer wrote: In response to Tom Lane: Asher Hoskins writes: I can't seem to get to_timestamp() or to_date() to work with quarters. The source code says: "We ignore Q when converting to date because it is not norm
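
Since Q is ignored on input, one workaround (not from the thread itself, and assuming PostgreSQL 9.4+ for make_date()) is to derive the first day of the quarter arithmetically from a 'YYYY-Q' style string:

    -- Split an example 'YYYY-Q' string and build the quarter's first day.
    SELECT make_date(
               split_part('2010-3', '-', 1)::int,                 -- year
               (split_part('2010-3', '-', 2)::int - 1) * 3 + 1,   -- first month of the quarter
               1);                                                 -- first day
    -- => 2010-07-01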

[GENERAL] Table growing faster than autovacuum can vacuum

2012-02-15 Thread Asher Hoskins
Hello. I've got a database with a very large table (currently holding 23.5 billion rows, the output of various data loggers over the course of my PhD so far). The table itself has a trivial structure (see below), is partitioned by the time/date of the data, and has quite acceptable INSERT/SELECT perfo
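
The actual schema isn't shown in this excerpt, but a table of the shape described might look something like the following sketch (column names invented), using the inheritance-based partitioning that was current in the PostgreSQL 9.x era; per-partition autovacuum settings can also be tightened with reloptions:

    -- Hypothetical parent table, partitioned by sample time via inheritance.
    CREATE TABLE sample (
        logger_id   integer          NOT NULL,
        sample_time timestamptz      NOT NULL,
        value       double precision NOT NULL
    );

    -- One child table per month, carrying a CHECK constraint so that
    -- constraint exclusion can skip irrelevant partitions at SELECT time.
    CREATE TABLE sample_2012_02 (
        CHECK (sample_time >= '2012-02-01' AND sample_time < '2012-03-01')
    ) INHERITS (sample);

    -- Autovacuum can be made more aggressive on the partition being written to:
    ALTER TABLE sample_2012_02
        SET (autovacuum_vacuum_scale_factor = 0.01);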