Kevin Grittner wrote:
> On Wed, Jan 30, 2008 at  8:13 PM, in message
> <[EMAIL PROTECTED]>, "Christopher
> Browne" <[EMAIL PROTECTED]> wrote:
>> There seems to be *plenty* of evidence out there that the performance
>> penalty would NOT be "essentially zero."
>
> I can confirm that I have had performance tank because of boosting
> the statistics target for selected columns.  It appeared to be time
> spent in the planning phase, not a bad plan choice.  Reducing the
> numbers restored decent performance.

One idea I've been thinking about is to add a step after ANALYZE to look at the statistics that were gathered. If the distribution looks pretty flat, reduce the data to a smaller set before storing it in pg_statistic.

You would still pay the cost of the longer ANALYZE time, but at least you would avoid the hit on query planning time where the higher statistics target is not helpful. We could also print an INFO line along the lines of "you might as well lower the statistics target for this column, because it's not helping".

No, I don't know how to determine when you could reduce the data, or how to reduce it...
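Just to make the idea concrete, one naive flatness test might compare the largest most-common-value frequency against the mean frequency and truncate the list when the two are close. The function name, the cutoff, and the standalone harness below are all made up for illustration; this is only a sketch of the kind of heuristic meant, not anything resembling real pg_statistic code:

    #include <stdio.h>
    #include <stdbool.h>

    /* Hypothetical heuristic: if the largest MCV frequency is close to
     * the mean frequency, the distribution is nearly uniform, so keeping
     * the whole list adds planning cost without improving selectivity
     * estimates. */
    static bool
    mcv_list_is_flat(const double *freqs, int n, double max_over_mean_cutoff)
    {
        double  sum = 0.0;
        double  max = 0.0;
        int     i;

        if (n == 0)
            return true;

        for (i = 0; i < n; i++)
        {
            sum += freqs[i];
            if (freqs[i] > max)
                max = freqs[i];
        }

        /* max/mean == 1.0 for a perfectly uniform list */
        return (max / (sum / n)) < max_over_mean_cutoff;
    }

    int
    main(void)
    {
        /* nearly uniform frequencies: a smaller target would do */
        double flat[10] = {0.011, 0.010, 0.010, 0.009, 0.011,
                           0.010, 0.010, 0.009, 0.010, 0.010};
        /* one dominant value: the full list is worth keeping */
        double skewed[5] = {0.50, 0.02, 0.01, 0.01, 0.01};

        printf("flat list:   %s\n",
               mcv_list_is_flat(flat, 10, 1.5) ? "truncate" : "keep");
        printf("skewed list: %s\n",
               mcv_list_is_flat(skewed, 5, 1.5) ? "truncate" : "keep");
        return 0;
    }

A coefficient-of-variation test would work much the same way; the hard part, as said above, is picking a cutoff that doesn't throw away statistics the planner actually needs.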

--
  Heikki Linnakangas
  EnterpriseDB   http://www.enterprisedb.com
