Hi all,
I am not sure if this should be sent to this mailing list. If I am wrong,
could someone please direct me to the correct one so I can subscribe
there.
I wanted to ask a simple question. Say I have a table with a timestamp
field. What is the best way to, say, get all the records that were cr
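[The question above is cut off, but assuming it is about fetching rows created
within a given time window, a plain range condition on the timestamp column is
the usual approach. A minimal sketch using the pg module; the events table,
the created_at column and the connection details are all invented:

    import pg

    con = pg.connect(dbname='mydb')   # connection details are placeholders
    res = con.query(
        "SELECT * FROM events "
        "WHERE created_at >= '2004-02-01' AND created_at < '2004-02-02'")
    rows = res.dictresult()           # one dict per matching row
]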
Hi Kevin,
On Tue, 3 Feb 2004, Kevin Brown wrote:
> Slavisa Garic wrote:
> > Using the pg module in Python I am trying to run the COPY command to
> > populate a large table. I am using this to replace the INSERT, which
> > takes a few hours to add 7 entries where copy ta
Hi,
I have a question about the COPY statement. I am using PGSQL (7.3.4) with
python-2.3 on a RedHat v8 machine. The problem I have is the following.
Using the pg module in Python I am trying to run the COPY command to
populate a large table. I am using this to replace the INSERT, which takes
about fe
Similar behaviour is observed, but it just takes a bit less time to insert
(0.01 less than usual at 6 records).
Regards,
Slavisa
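[For reference, a rough sketch of one way to drive COPY ... FROM STDIN through
the classic PyGreSQL pg module. The jobs table, its data and the connection
details are invented; putline()/endcopy() are the module's low-level COPY
interface:

    import pg

    con = pg.connect(dbname='mydb')                 # placeholder connection details
    rows = [(1, 'first job'), (2, 'second job')]    # data already held in memory

    con.query("COPY jobs FROM STDIN")               # "jobs" is an invented table name
    for r in rows:
        # COPY expects tab-separated columns, one row per line
        # (real data would also need tabs/backslashes escaped)
        con.putline('\t'.join([str(c) for c in r]) + '\n')
    con.putline('\\.\n')                            # end-of-data marker
    con.endcopy()
]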
On Fri, 14 Nov 2003, Dann Corbit wrote:
> > -----Original Message-----
> > From: Slavisa Garic [mailto:[EMAIL PROTECTED]
> > Sent: Thursday, November
On Fri, 14 Nov 2003, Alvaro Herrera wrote:
> On Fri, Nov 14, 2003 at 06:36:41PM +1100, Slavisa Garic wrote:
>
> > Rows Present    Start Time    Finish Time
> >
> > 100
Hi Everyone,
This is my first post here so please tell me to go somewhere else if this
is the wrong place to post questions like this.
I am using PostgreSQL 7.3.2 and have used earlier versions (7.1.x onwards),
and with all of them I noticed the same problem with INSERTs when there is a
large data set
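[As an aside, and only a guess since the rest of the message is cut off: if
each INSERT is committed separately, much of the time goes into per-row
commits, and wrapping the whole batch in one transaction is a cheap thing to
try. A minimal sketch, again with an invented jobs table:

    import pg

    con = pg.connect(dbname='mydb')                 # placeholder connection details
    rows = [(1, 'first'), (2, 'second')]            # sample data

    con.query("BEGIN")
    for job_id, name in rows:
        # one INSERT per row, but only a single commit at the end;
        # real code should quote/escape the values properly
        con.query("INSERT INTO jobs (id, name) VALUES (%d, '%s')" % (job_id, name))
    con.query("COMMIT")
]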