Thank you for answering so quickly.
After a good night's sleep, I found the solution.
The problem was a variable that had the same name as a field referenced
in the query.
So everything looks fine now!
Thanks a lot
On Sunday, 31 January 2010 at 16:55 -0700, Scott Marlowe wrote:
> On Sun, Jan 31, 20
select distinct a.* from test a, test b where a.fname = b.fname
and a.lname=b.lname
and a.sn <> b.sn
Regards,
Jayadevan
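As an aside, the self-join above can be exercised outside PostgreSQL too; here is a minimal Python/SQLite sketch, assuming `sn` is the unique serial column and matching `fname`/`lname` define a duplicate (sample data is made up):

```python
import sqlite3

# Hypothetical sample data: rows 1 and 2 are duplicates by name.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE test (sn INTEGER PRIMARY KEY, fname TEXT, lname TEXT)")
conn.executemany("INSERT INTO test VALUES (?, ?, ?)",
                 [(1, "Ann", "Smith"), (2, "Ann", "Smith"), (3, "Bob", "Jones")])

# Same self-join as in the reply: rows sharing fname/lname with a
# *different* row (different sn) are the duplicates.
dupes = conn.execute("""
    SELECT DISTINCT a.* FROM test a, test b
    WHERE a.fname = b.fname
      AND a.lname = b.lname
      AND a.sn <> b.sn
""").fetchall()
print(sorted(dupes))  # both "Ann Smith" rows
```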
From: zach cruise
To: pgsql-general@postgresql.org
Date: 01/29/2010 10:09 PM
Subject: [GENERAL] how to look for duplicate rows?
Sent by: pgsql-general-
Hi,
I am prototyping a system which sends all INSERT/UPDATE/DELETE events
to third-party software. I do:
CREATE TABLE data (id Serial PRIMARY KEY, data VARCHAR(255));
CREATE TABLE log (op CHAR(6), id integer, data VARCHAR(255));
CREATE OR REPLACE RULE send_notify AS ON INSERT TO log DO ALSO NOT
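The NOTIFY part aside, the log-table mechanics can be sketched with plain triggers. Below is a minimal Python/SQLite analogue of the two tables above, for illustration only; PostgreSQL itself would use a single trigger function with TG_OP, or the rule being built here:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE data (id INTEGER PRIMARY KEY, data VARCHAR(255));
    CREATE TABLE log (op CHAR(6), id INTEGER, data VARCHAR(255));

    -- One trigger per event type (SQLite has no TG_OP equivalent).
    CREATE TRIGGER data_ins AFTER INSERT ON data
    BEGIN INSERT INTO log VALUES ('INSERT', NEW.id, NEW.data); END;
    CREATE TRIGGER data_upd AFTER UPDATE ON data
    BEGIN INSERT INTO log VALUES ('UPDATE', NEW.id, NEW.data); END;
    CREATE TRIGGER data_del AFTER DELETE ON data
    BEGIN INSERT INTO log VALUES ('DELETE', OLD.id, OLD.data); END;
""")
conn.execute("INSERT INTO data VALUES (1, 'hello')")
conn.execute("UPDATE data SET data = 'world' WHERE id = 1")
conn.execute("DELETE FROM data WHERE id = 1")
ops = [row[0] for row in conn.execute("SELECT op FROM log ORDER BY rowid")]
print(ops)  # ['INSERT', 'UPDATE', 'DELETE']
```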
I have just installed 8.4 on a Windows XP machine. My intent is to use
it with Trac and Apache.
I would like to validate the installation of pgsql. What would be a
good method to make sure that pgsql is installed correctly?
Ray
On 01/31/2010 03:52 PM, Florent THOMAS wrote:
> Hello everybody,
>
> I'm trying to find out how to have a dynamic crosstab as in Excel,
> iReport, etc.
> As I understand from the manual here:
> http://docs.postgresqlfr.org/8.4/tablefunc.html
> I can have multiple columns.
>
> Unfortunately, it se
On Sun, Jan 31, 2010 at 4:53 PM, Florent THOMAS wrote:
> Hi everybody,
>
> I have a problem with 2 triggers.
>
> I work on 3 tables :
> table A ==> with one trigger after insert that inserts values in table B
> Table B ==> with one trigger after insert that inserts values in table C
> Table C
> As I
Hi everybody,
I have a problem with 2 triggers.
I work on 3 tables :
table A ==> with one trigger after insert that inserts values in table B
Table B ==> with one trigger after insert that inserts values in table C
Table C
As I insert values on table A, I have a message that indicates the
EXECUTE s
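The error message above is cut off, so the actual failure can't be diagnosed here, but the intended A -> B -> C chain itself is straightforward. For illustration, a minimal sketch of such cascading AFTER INSERT triggers, in Python/SQLite:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE a (v INTEGER);
    CREATE TABLE b (v INTEGER);
    CREATE TABLE c (v INTEGER);

    -- Table A's trigger inserts into B ...
    CREATE TRIGGER a_to_b AFTER INSERT ON a
    BEGIN INSERT INTO b VALUES (NEW.v); END;
    -- ... and B's trigger inserts into C.
    CREATE TRIGGER b_to_c AFTER INSERT ON b
    BEGIN INSERT INTO c VALUES (NEW.v); END;
""")
conn.execute("INSERT INTO a VALUES (42)")
result = conn.execute("SELECT v FROM c").fetchall()
print(result)  # one insert into A cascaded through B into C
```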
Hello everybody,
I'm trying to find out how to have a dynamic crosstab as in Excel,
iReport, etc.
As I understand from the manual here:
http://docs.postgresqlfr.org/8.4/tablefunc.html
I can have multiple columns.
Unfortunately, it seems indispensable to name the columns in the AS
clause.
Am I rig
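crosstab() does expect the output columns spelled out in the AS clause; a common workaround is two passes, querying the distinct category values first and then generating the pivot statement from them. A sketch of that idea in Python/SQLite (table and column names are made up):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, month TEXT, amount INTEGER)")
conn.executemany("INSERT INTO sales VALUES (?, ?, ?)", [
    ("north", "jan", 10), ("north", "feb", 20), ("south", "jan", 5),
])

# Pass 1: discover the column set at run time.
months = [r[0] for r in
          conn.execute("SELECT DISTINCT month FROM sales ORDER BY month")]
# Pass 2: build the pivot query from that list.
cols = ", ".join(
    f"SUM(CASE WHEN month = '{m}' THEN amount ELSE 0 END) AS {m}"
    for m in months)
sql = f"SELECT region, {cols} FROM sales GROUP BY region ORDER BY region"
rows = conn.execute(sql).fetchall()
print(rows)  # columns are feb, jan (alphabetical)
```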
Thanks. That works nicely.
On Tue, Jan 26, 2010 at 8:00 PM, Greg Smith wrote:
> zhong ming wu wrote:
>>
>> Is there a way to figure out from binaries what options were used to
>> compile/config? For example with apache I can do "httpd -l"
>>
>
> pg_config is what you're looking for.
>
> In some
Mads Lie Jensen writes:
> SELECT pg_catalog.pg_get_constraintdef(r.oid, true) AS condef
> FROM pg_catalog.pg_constraint r,
> pg_catalog.pg_class c
> WHERE c.oid=r.conrelid
> AND r.contype = 'f'
> AND c.relname = 'table_name'
> which gives me the foreign
I'm not sure if this is the best list to ask... I need to know whether
the server is able to accept connections - is there a way to call
canAcceptConnections() from the front end somehow?
Thanks.
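canAcceptConnections() is internal to the backend; from a client, the usual approach is simply to attempt a connection. A rough front-end sketch in Python - this is a TCP reachability check only, and a real probe should go on to attempt a protocol-level login through a driver, since the postmaster may accept the socket yet still refuse sessions (recovery, max_connections, etc.):

```python
import socket

def server_reachable(host: str, port: int, timeout: float = 1.0) -> bool:
    """Crude check: can we even open a TCP connection to the server?"""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

print(server_reachable("127.0.0.1", 5432))  # True only if something listens there
```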
Hi
I have this:
SELECT pg_catalog.pg_get_constraintdef(r.oid, true) AS condef
FROM pg_catalog.pg_constraint r,
pg_catalog.pg_class c
WHERE c.oid=r.conrelid
AND r.contype = 'f'
AND c.relname = 'table_name'
which gives me the foreign keys of a given table
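That catalog query is PostgreSQL-specific (information_schema.table_constraints is the portable route). Purely to illustrate the same idea - asking the database for a table's foreign keys - here is the SQLite equivalent in Python:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE parent (id INTEGER PRIMARY KEY);
    CREATE TABLE child (
        id INTEGER PRIMARY KEY,
        parent_id INTEGER REFERENCES parent(id)
    );
""")
# SQLite's analogue of interrogating pg_constraint for contype = 'f'.
# Each row: (id, seq, referenced_table, from_col, to_col,
#            on_update, on_delete, match)
fks = conn.execute("PRAGMA foreign_key_list(child)").fetchall()
print(fks)
```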
Sebastien Boisvert writes:
> [ COPY fails to dump a 138MB bytea column ]
If you can't switch to a 64-bit build of Postgres, you might need to
think about converting those byteas to large objects. It's expected for
COPY to require memory space equal to several times the width of the row
it's tryi
On Fri, Jan 29, 2010 at 18:34, Greg Sabino Mullane wrote:
>
>>> yet, so that page should be listing 7.4.27. Further, shouldn't we be keeping
>>> even 'unsupported' versions on this page, so (e.g. case of
>>> check_postgres.pl)
>>> clients can check if they have the latest revision, even if the ma
On Jan 31, 2010, at 2:46 AM, Joe Kramer wrote:
> Hi,
>
> I need to store a lot of large files (thousands of 10-100 MB files)
> uploaded through my web application, and I find that storing them in
> the database as a bytea field is not practical for backup purposes.
> My database has full backup perfor
Hi all,
We have an OS X app which integrates postgres as its database backend, and
recently we've had a couple of cases where users haven't been able to perform
a backup of their database. The failure gets reported as a problem in a table
("largedata") where we store large binary objects, wi
On Sun, Jan 31, 2010 at 7:25 AM, Craig Ringer
wrote:
>> However, here lies the problem: I need to use SERIALIZABLE transaction
>> isolation level, and AFAIK it's not possible to make several database
>> connections to share the same exact view of the database.
> I've noticed some talk on -HACKERS
On Sun, Jan 31, 2010 at 4:02 PM, Craig Ringer
wrote:
>> I've found this discussion in -HACKERS:
>> http://osdir.com/ml/pgsql-hackers/2009-11/msg00265.html It seems, it's
>> exactly what I need to do. I might try to contribute a patch.
> Well, if you're able to that'd be absolutely brilliant :-)
I
On 01/31/2010 04:46 AM, Joe Kramer wrote:
Hi,
I need to store a lot of large files (thousands of 10-100 MB files)
uploaded through my web application, and I find that storing them in
the database as a bytea field is not practical for backup purposes.
My database has full backup performed every 12 hour
On 31/01/2010 6:46 PM, Joe Kramer wrote:
Hi,
I need to store a lot of large files (thousands of 10-100 MB files)
uploaded through my web application, and I find that storing them in
the database as a bytea field is not practical for backup purposes.
My database has full backup performed every 12 hours
On 31/01/2010 9:06 PM, Alex Besogonov wrote:
On Sun, Jan 31, 2010 at 7:25 AM, Craig Ringer
wrote:
However, here lies the problem: I need to use SERIALIZABLE transaction
isolation level, and AFAIK it's not possible to make several database
connections to share the same exact view of the databas
Joe Kramer wrote:
Hi,
I need to store a lot of large files (thousands of 10-100 MB files)
uploaded through my web application, and I find that storing them in
the database as a bytea field is not practical for backup purposes.
My database has full backup performed every 12 hours and backup is
encrypte
Hi,
I need to store a lot of large files (thousands of 10-100 MB files)
uploaded through my web application, and I find that storing them in
the database as a bytea field is not practical for backup purposes.
My database has full backup performed every 12 hours and backup is
encrypted and copied to serv
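One common alternative, sketched below in Python/SQLite with hypothetical names: keep the blobs on the filesystem and store only a path plus checksum in the database, so dumps stay small and file-level tools (rsync etc.) can back the blobs up incrementally:

```python
import hashlib
import shutil
import sqlite3
import tempfile
from pathlib import Path

# Hypothetical layout: uploads live on disk; the database keeps only
# path and checksum metadata.
store = Path(tempfile.mkdtemp())
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE upload (id INTEGER PRIMARY KEY, path TEXT, sha256 TEXT)")

def save_upload(data: bytes) -> int:
    digest = hashlib.sha256(data).hexdigest()
    path = store / digest          # content-addressed: dedupes identical files
    path.write_bytes(data)
    cur = conn.execute("INSERT INTO upload (path, sha256) VALUES (?, ?)",
                       (str(path), digest))
    return cur.lastrowid

uid = save_upload(b"big file contents")
row = conn.execute("SELECT path, sha256 FROM upload WHERE id = ?",
                   (uid,)).fetchone()
assert Path(row[0]).read_bytes() == b"big file contents"
shutil.rmtree(store)  # clean up the temporary store
```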