On Tue, Oct 11, 2011 at 5:45 AM, Greg Stark <st...@mit.edu> wrote:
> On Mon, Oct 10, 2011 at 9:17 PM, Tom Lane <t...@sss.pgh.pa.us> wrote:
>> My intention was to allow it to consider any covering index.  You're
>> thinking about the cost estimate, which is really entirely different.
>
> Is there any reason to consider more than one? I would have expected
> the narrowest one to be the best choice. There's something to be said
> for using the same index consistently but we already have that problem
> and make no attempt to do that. And partial indexes might be better
> but then we would already be considering them if their constraints are
> satisfied.
You raise a fantastic idea: use an index's frequency of use as a factor
in the cost of optimising a query. We have previously discussed using
the RAM residency of an index to adjust its cost, but residency is
difficult to measure. Using the long-term prevalence of usage as a
weighting factor makes a great deal of sense for queries that could
potentially utilise multiple indexes; that information is readily
available and directly applicable. Since prevalence of use is what
drives RAM residency in the first place, it makes sense to feed the
causal factor into the cost.

--
Simon Riggs                   http://www.2ndQuadrant.com/
PostgreSQL Development, 24x7 Support, Training & Services
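
For illustration only, here is a minimal sketch of the kind of weighting
this suggests. It is not PostgreSQL source and every name in it is
invented; it simply interpolates an index's per-page fetch cost between
a cold-cache figure and an in-RAM figure according to that index's
long-term share of scans:

    /*
     * Hypothetical sketch, not PostgreSQL code: treat an index's
     * long-term share of scans as a proxy for how much of it is
     * resident in RAM, and interpolate its per-page fetch cost
     * accordingly.  All names here are invented for illustration.
     */
    #include <math.h>

    double
    weighted_index_page_cost(double random_page_cost,   /* cold-cache cost per page */
                             double cached_page_cost,   /* cost when page is in RAM */
                             double index_scans,        /* long-term scans of this index */
                             double total_index_scans)  /* scans of all candidate indexes */
    {
        double hotness;

        if (total_index_scans <= 0.0)
            return random_page_cost;    /* no usage history yet: assume cold */

        /* Share of scans that used this index, clamped to [0, 1]. */
        hotness = fmax(0.0, fmin(1.0, index_scans / total_index_scans));

        /* A heavily used index is assumed mostly resident, so its pages
         * cost closer to the cached figure; a rarely used one costs
         * closer to a cold random fetch. */
        return hotness * cached_page_cost + (1.0 - hotness) * random_page_cost;
    }

The scan counts could plausibly come from the existing statistics
collector (what pg_stat_user_indexes exposes as idx_scan), though how to
age those counters and how such a weight should interact with
effective_cache_size are exactly the open questions.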