> > Because we do not want the DBA to decide which statistics are optimal,
> > there should probably be an analyze helper application that is invoked
> > with "vacuum analyze database optimal" or some such, that also decides
> > whether a table was sufficiently altered to justify new stats gathering.
> > 3. If at all, an automatic analyze should do the samples on small tables,
> > and accurate stats on large tables.
>
> Other way 'round, surely? It already does that: if your table has fewer
> rows than the sampling target, they all get used.
I mean that it is probably not useful to maintain …
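The behavior described above ("if your table has fewer rows than the sampling target, they all get used") can be sketched roughly as follows. This is a minimal illustration, not PostgreSQL's actual implementation; the function name, the default target of 3000 rows, and the in-memory row list are all assumptions for the example.

```python
import random

def analyze_sample(rows, target=3000):
    """Hypothetical sketch: if the table has fewer rows than the
    sampling target, every row is used, so small tables get exact
    statistics automatically; only large tables are estimated from
    a random sample."""
    if len(rows) <= target:
        sample = list(rows)                   # full scan: exact stats
    else:
        sample = random.sample(rows, target)  # random sample: approximate stats
    return {"sample_size": len(sample),
            "n_distinct_estimate": len(set(sample))}

# A table smaller than the target is scanned in full:
small_table = list(range(100))
stats = analyze_sample(small_table)
assert stats["sample_size"] == 100
```

So the current behavior already degenerates to exact statistics on small tables without any extra syntax; only tables larger than the target are approximated.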
Zeugswetter Andreas SB <[EMAIL PROTECTED]> writes:
> Imho that is not optimal :-) ** ducks head, to evade flying hammer **
> 1. the random sample approach should be explicitly requested with some
> syntax extension
I don't think so ... with the current implementation you *must* do
approximate ANALYZE …