Hi,
I'm using Pg for bioinformatic work and I want to be able to insert,
uniquely, biological sequences into a table returning the sequence id -
this part is fine. However, if the sequence already exists in the table
I want to return the id.
At the moment it seems to me that I should do a
SELECT
Thanks for that - yes very helpful. Good to know what is possible.
Dan
On Tue, 2010-11-23 at 10:27 +0100, Matthieu Huin wrote:
> A similar question was discussed here about 3 weeks ago :
> http://archives.postgresql.org/pgsql-general/2010-11/msg00110.php
>
> The "UPSERT" facility not being impl
Hi,
I'm using the perl DBI module to interface with Pg, generating a number
of tables and then loading them into a postgres database (this is to
automate a previously psql-based setup).
One instance of loading the data looks like this, but I am only able to
do this as a superuser (this is possibl
Thanks for that.
On Mon, 2009-10-12 at 20:21 -0400, Stephen Frost wrote:
> * Dan Kortschak (dan.kortsc...@adelaide.edu.au) wrote:
> > $dbh->do("COPY chromosome_data FROM '".chromosomes(\%options)."' CSV");
>
> > Does anyone have any suggestions
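[A note for archive readers on why the quoted `COPY ... FROM 'file'` needs superuser, and the usual alternative. The table name is taken from the quoted code; everything else is a sketch.]

```sql
-- COPY ... FROM '<file>' makes the *server* process read the file, which
-- requires superuser (or, on newer releases, the pg_read_server_files role).
-- COPY FROM STDIN instead streams the data over the client connection, so an
-- ordinary role with INSERT privilege can use it.  psql's \copy wraps this,
-- and DBD::Pg exposes it via $dbh->pg_putcopydata / $dbh->pg_putcopyend.
COPY chromosome_data FROM STDIN CSV;
```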
Thanks again.
On Mon, 2009-10-12 at 21:14 -0400, Stephen Frost wrote:
> > Seems like the way to go, though it will be significantly slower
> than
> > psql or superuser reads (a couple of tables have ~10s-100sM rows).
>
> Erm, really? You've tested that and found it to be that much slower?
Sorry
Hi, this is a bit of a noob question.
I am using PGSql to perform some large analyses, with the clients being
a sequentially run set of perl scripts (one to set up and populate
tables and then downstream scripts to query the database for the
results).
During manual testing everything works, but
Yes, they are separate perl files (I'm thinking that perhaps this wasn't
the best way to do it now, but for the moment I'm going to have to stick
with it).
In the case of the manual testing it's just a matter of command line
calls. The automated runs call each script as part of a PBS torque
script
Thanks for that, that should help me sort it out. I haven't used the
autocommit option in pgdbi. I'll have a look to see if DBI::do has an
option to wait for command completion.
cheers
On Mon, 2009-12-07 at 16:12 -0500, Tom Lane wrote:
> It's not. What you want is to COMMIT and make sure you've
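[Sketch of the pattern Tom is pointing at: with `AutoCommit` off in DBI, work is not durable until an explicit COMMIT, so a batch script must commit before the scheduler considers the job done.]

```sql
BEGIN;
-- ... table loads and updates issued by the script ...
COMMIT;  -- ensure the work is durable before the next PBS job starts
```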
Thanks to everyone who has answered this. The short answer is that
torque is not behaving the way I expected and not the way I have ever
seen it behave in the past. The I/O binding of these jobs may have
something to do with this, but I will look into it further.
cheers
On Mon, 2009-12-07 at 13:2
I've spoken to people on the torque user mailing list and tried Merlin's
suggestion below (which looked like it should work - but unfortunately
did not prevent the problem).
From working through things with the torque list, it seems to be the
case that postgresql is behaving differently because i
Thanks Tom,
That was my initial view and I'm still not convinced that I'm wrong - I
can see no way that the server can tell that the query came from a
process without a terminal (though I may be missing something here).
Unfortunately I'm working at the boundary of my knowledge for both
torque and
Thanks for that clarification Merlin,
The server/client is on a workstation that is essentially private (I
share some time with students, but they don't have pg access). The locks
are across sessions. There are three perl scripts that connect to a pg
db, one loads the database and creates some com