Hi guys, I met a problem when trying to import data from a
tab-delimited text file into a table with the 'COPY FROM'
command. A sample procedure is explained as follows: first, I
generate a file called /home/temp/test.txt;
let's suppose it has only two lines (the four figures are delimi
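The procedure described above can be sketched in SQL. The table definition and column names below are assumptions, since the original message is truncated before the schema is shown:

```sql
-- Hypothetical table matching a tab-delimited file with four integer
-- columns per line (the actual schema is not shown in the message).
CREATE TABLE test (a integer, b integer, c integer, d integer);

-- Tab is COPY's default delimiter for text format, so no explicit
-- delimiter clause is needed for a tab-separated file.
COPY test FROM '/home/temp/test.txt';
```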
"Dong, Meng" <[EMAIL PROTECTED]> writes:
> SGkgZ3V5cywNCg0KSSBtZWV0IGEgcHJvYmxlbSB3aGVuIHRyeSB0byBpbXBv
> cnQgZGF0YSBmcm9tIGEgVGFiLWRlbGltaXRlZC10ZXh0IGZpbGUgdG8gYSB0
> YWJsZSBieSBjb21tYW5kICdjb3B5IGZyb20nLiBBIHNhbXBsZSBwcm9jZWR1
> cmUgaXMgZXhwbGFpbmVkIGFzIGZvbGxvdywNCg0KDQpmaXJzdCwgSSBnZW5l
> cmF
Hello!
I am away on vacation - I will reply to your
message when I return.
For matters concerning the SKY-NET company,
please write directly to [EMAIL PROTECTED]
Temp tables don't interact well with plpgsql's attempts to cache query
plans for the queries in its functions. This is a bug, but a solution
is not close at hand.
In 7.1, you can work around this by using EXECUTE to execute the queries
on the temp table, thus forcing a re-plan on every execution.
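A minimal sketch of that workaround, in 7.1-era PL/pgSQL (the function and table names are hypothetical; the quoted-body style with doubled single quotes was required before dollar quoting existed):

```sql
-- Without EXECUTE, the first call caches plans bound to the first temp
-- table's OID; after it is dropped and re-created, later calls fail.
CREATE FUNCTION use_temp() RETURNS integer AS '
DECLARE
    rec record;
    n integer := 0;
BEGIN
    -- EXECUTE forces a fresh plan on each call, so temp_t is looked up
    -- anew instead of through a stale cached plan.
    EXECUTE ''CREATE TEMP TABLE temp_t (x integer)'';
    EXECUTE ''INSERT INTO temp_t VALUES (1)'';
    FOR rec IN EXECUTE ''SELECT count(*) AS c FROM temp_t'' LOOP
        n := rec.c;
    END LOOP;
    EXECUTE ''DROP TABLE temp_t'';
    RETURN n;
END;
' LANGUAGE 'plpgsql';
```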
I am running PostgreSQL 7.1 on Red Hat 6.2, kernel 2.4.6.
Under a pretty heavy load:
1000 transactions per second
32 open connections
Everything restarts because of too many open files.
I have increased my max number of open files to 16384, but this
just delays the inevitable.
I have
Darin Fisher <[EMAIL PROTECTED]> writes:
> I am running PostgreSQL 7.1 on Red Hat 6.2, kernel 2.4.6.
> Under a pretty heavy load:
> 1000 transactions per second
> 32 open connections
> Everything restarts because of too many open files.
> I have increased my max number of open files to 16
Thanks, so far that looks like it is helping.
Only time will tell :)
I take it that pg_nofile is the maximum number of open files per postgres
session?
Darin
Tom Lane wrote:
> Darin Fisher <[EMAIL PROTECTED]> writes:
> > I am running PostgreSQL 7.1 on Red Hat 6.2, kernel 2.4.6.
> > Under a pret
> I take it that pg_nofile is the maximum number of open files per postgres
> session?
Right, it's per backend.
regards, tom lane
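Since the limit is per backend, worst-case descriptor demand scales with the connection count. A minimal shell sketch of the arithmetic (the per-backend allowance of 500 is an assumed illustrative value, not PostgreSQL's actual setting):

```shell
# Worst-case file-descriptor demand: per-backend allowance times the
# number of backends. Both values below are assumptions for illustration.
PG_NOFILE=500     # assumed per-backend open-file allowance
BACKENDS=32       # open connections reported above
echo "worst case: $((PG_NOFILE * BACKENDS)) descriptors"
```

With these numbers the kernel-wide file table must accommodate 16000 descriptors on top of everything else the system has open, which is why a raised fs.file-max only delays exhaustion.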
Rahul Gade ([EMAIL PROTECTED]) reports a bug with a severity of 2.
(The lower the number, the more severe it is.)
Short Description
The query works the first time but fails the second time on one connection.
Long Description
In fact, I want to execute the code which I had placed in a remark
(comment), but since the ORDER BY, l
From my /etc/rc.d/rc.local:
# increase RCVBUF to optimize proxy<->backend
echo 131072 > /proc/sys/net/core/rmem_max
# increase maximum opened files
echo 8192 > /proc/sys/fs/file-max
# increase shared memory (the value "1" in the archived message looks
# truncated; shmmax is in bytes, so a real setting would be something
# like 134217728 for 128 MB)
echo 134217728 > /proc/sys/kernel/shmmax
Regards,
Oleg