Hello,
I am setting up a proof of concept database to store some historical data.
Whilst I've used PostgreSQL a bit in the past, this is the first time I've
looked into disk usage due to the amount of data that could potentially be
stored. I've done a quick test and I'm a little confused as to why
Hi all,
I'm working on a problem at the moment where I have some data that I
need to get from a proprietary system into a web page. I was thinking
of using PostgreSQL as a middleman to store the data, e.g.:
- C++ app reads data from the proprietary system and writes it into a temp
table in PostgreSQL (sketched below)
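A staging table for that first step might look something like this (a minimal sketch; the table and column names are assumptions, not from the original post):

CREATE TABLE staging_import
(
    staging_id serial PRIMARY KEY,
    read_at    timestamp without time zone NOT NULL,
    source_key text NOT NULL,                     -- identifier of the item in the proprietary system
    value      double precision,
    processed  boolean NOT NULL DEFAULT false     -- lets the web side mark rows it has consumed
);

The C++ app then only ever INSERTs, the web page only ever SELECTs rows WHERE processed = false, and the two sides never have to talk to each other directly.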
Arndt,
Your website says rubyrep runs on Linux and Windows - am I going to have
difficulties if I want to try it on Solaris 10?
Andrew
2009/6/23 Arndt Lehmann
> On Jun 16, 7:48 pm, nishkars...@rediffmail.com (Nishkarsh) wrote:
> > Hi Merlin, thanks for the detailed input.
> >
> > As per your sug
2009/6/2 björn lundin
>
> > CREATE TABLE "DataImport"
> > (
> >     "DataImportID" serial NOT NULL PRIMARY KEY,
> >     "Time"         timestamp without time zone NOT NULL,
> >     "ID_ABC"       integer NOT NULL,
> >     "ID_DEF"       integer NOT NULL,
> >     "ID_HIJ"       integer NOT NULL,
> >     etc
> > );
>
> Perhaps you want
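A common alternative to one-column-per-item is a narrow, one-row-per-reading table (a sketch only; the column names are illustrative):

CREATE TABLE "DataImport"
(
    "DataImportID" serial NOT NULL PRIMARY KEY,
    "Time"         timestamp without time zone NOT NULL,
    "ItemID"       integer NOT NULL,
    "Value"        double precision NOT NULL
);

Each of the 1500 items then becomes one row per minute rather than one column, so adding or removing an item never requires an ALTER TABLE.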
On Mon, Jun 1, 2009 at 1:25 AM, Tom Lane wrote:
> Andrew Smith writes:
> > I'm a beginner when it comes to PostgreSQL, and have a table design question
> > about a project I'm currently working on. I have 1500 data items that need
> > to be copied ever
Hi all,
I'm a beginner when it comes to PostgreSQL, and have a table design question
about a project I'm currently working on. I have 1500 data items that need
to be copied every minute from an external system into my database. The
items have a timestamp, an identifier and a value. For example:
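(made-up values, purely to show the shape)

    2009-06-01 10:00:00   ITEM_0001   42.5
    2009-06-01 10:00:00   ITEM_0002   17.0
    2009-06-01 10:00:00   ITEM_0003   0.0

Every minute another 1500 rows like these would arrive, all sharing a fresh timestamp.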