Muruganantham M wrote:
Hi,
We are developing a website using ChessD in a Red Hat environment. ChessD is
open source and requires Postgres as its back end. When we tried to
install ChessD we got the error:
Missing postgresql/libpq-fe.h, is libpq-dev installed?
But ChessD works fine on Debian, and we need a
Have you tried looking for
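On Red Hat there is no package called libpq-dev (that is the Debian name); the equivalent development package is postgresql-devel. A likely fix, assuming yum is available (exact package names can vary by release):

```shell
# Debian/Ubuntu name, as the error message suggests:
#   apt-get install libpq-dev
# Red Hat / Fedora / CentOS equivalent:
yum install postgresql-devel

# Confirm the header is now present (location varies by distribution):
find /usr/include -name libpq-fe.h
```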
On Sun, Mar 12, 2006 at 11:46:25 -,
Phadnis <[EMAIL PROTECTED]> wrote:
>
> 1 ) when i try to query for count or for any thg it takes a long time to
> return the result. How to avoid this
Postgres doesn't cache counts; count(*) has to scan the matching rows each time, so if you are counting a lot of records this may take a while to run.
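When an exact count is not required, a common workaround is to read the planner's row estimate instead of scanning the table. A sketch, assuming a table named orders (reltuples is only as fresh as the last VACUUM or ANALYZE):

```sql
-- Exact count: scans every matching row, slow on large tables
SELECT count(*) FROM orders;

-- Approximate count from planner statistics, nearly instant
-- (updated by VACUUM / ANALYZE, so it can lag reality)
SELECT reltuples::bigint AS estimated_rows
FROM pg_class
WHERE relname = 'orders';
```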
On 12 Mar 2006 11:46:25 -, Phadnis <[EMAIL PROTECTED]> wrote:
> Hi.
>
> I am new to postgres and I need help from you. I hope I get a positive response,
> though my questions might seem silly to you...
>
> I am working on postgres. I have around 1 lakh (100,000) records in almost 12 tables.
> 1 ) when I t
Hi.
I am new to postgres and I need help from you. I hope I get a positive response, though my questions might seem silly to you...
I am working on postgres. I have around 1 lakh (100,000) records in almost 12 tables.
1 ) when I try to query for a count or for anything, it takes a long time to return the result.
Hi Josh,
Can you tell me in what way it affects performance? And how do I decide what value to set for random_page_cost? Does it depend on any other factors?
Thanks,
Saranya
Josh Berkus <[EMAIL PROTECTED]> wrote:
Sarlav,
> I am sorry, I am not aware of what random_page_cost is, as I am new to
> Postgres. What does it signify and how do I reduce random_page_cost?
It's a parameter in your postgresql.conf file. After you test it, you will
want to change it there and reload the server (pg_ctl reload).
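Before editing postgresql.conf, the value can be tried out per session. A sketch (2.0 is an illustrative value, not a recommendation, and my_table is a placeholder):

```sql
-- Test a lower value for the current session only (the shipped default is 4)
SET random_page_cost = 2.0;

-- Re-run the problem query and compare the plan against the default's
EXPLAIN ANALYZE SELECT * FROM my_table WHERE id = 42;

-- Return to the configured default
RESET random_page_cost;
```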
From: "sarlav kumar" <[EMAIL PROTECTED]>
> [Tom:]
> >You might get some results from increasing the
> >statistics target for merchant_purchase.merchant_id.
>
> Do I have to use vacuum analyze to update the statistics? If so, I have
> already tried that and it doesn't seem to help.
alter table mer
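The command truncated above is presumably of this general form; a sketch using the column Tom named (the target value 100 is illustrative, and ANALYZE must be re-run for the new target to take effect):

```sql
-- Raise the statistics target (sample detail kept) for this column
ALTER TABLE merchant_purchase ALTER COLUMN merchant_id SET STATISTICS 100;

-- Re-collect statistics so the planner sees the new target
ANALYZE merchant_purchase;
```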
Hi Tom,
Thanks for the help.
> The major issue seems to be in the sub-selects:
>   -> Seq Scan on merchant_purchase mp (cost=0.00..95.39 rows=44 width=4) (actual time=2.37..2.58 rows=6 loops=619)
>        Filter: (merchant_id = $0)
> where the estimated row count is a factor of 7 too high. If the
> estima
sarlav kumar <[EMAIL PROTECTED]> writes:
> I have a query which does not use index scan unless I force postgres to use
> index scan. I don't want to force postgres, unless there is no way of
> optimizing this query.
The major issue seems to be in the sub-selects:
> -> Seq Scan
Hi All,
I am new to Postgres.
I have a query which does not use an index scan unless I force postgres to use one. I don't want to force postgres unless there is no way of optimizing this query.
The query:
select m.company_name,m.approved,cu.account_no,mbt.business_name,cda.country,
(s
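A way to check whether the index plan would actually win, without forcing it in production, is to disable sequential scans for one session and compare plans. A sketch:

```sql
-- Discourage sequential scans for the current session only
SET enable_seqscan = off;

-- Re-run the query above under EXPLAIN ANALYZE and compare cost and
-- timing against the plan chosen with the default settings; if the index
-- plan is genuinely faster, adjust statistics or random_page_cost rather
-- than forcing the plan permanently.

SET enable_seqscan = on;
```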