On Wed, Nov 02, 2016 at 07:48:23PM -0400, Tom Lane wrote:
> There's not a lot of point in worrying about your two-row table when these
> other estimates are off by multiple orders of magnitude. In this
> particular case my first bet would be that the planner has no idea about
> the selectivity of
Justin Pryzby writes:
>> With only two rows in the table, I'm not real sure why you'd need an MCV
>> list. Could we see the actual problem query (and the other table
>> schemas), rather than diving into the code first?
> Sigh, yes, but understand that it's a legacy report which happens to curren
On Wed, Nov 02, 2016 at 04:05:46PM -0400, Tom Lane wrote:
> Justin Pryzby writes:
> > I believe the join is being (badly) underestimated, leading to a crappy plan
> > involving multiple nested loop joins, which takes 2.5 hours instead of a
> > handful of seconds; I believe that might be resolved by populating its MCV
> > list..
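A minimal diagnostic sketch for the nested-loop complaint above, assuming a
psql session on the affected database (the actual report query is not included
in this excerpt, so it is only referenced in a comment): temporarily rule
nested loops out and compare timings.

-- Diagnostic only, not a fix: check whether the 2.5-hour runtime really comes
-- from the nested-loop plan shape.  Affects the current session only.
SET enable_nestloop = off;
-- ... re-run the report query here and compare the EXPLAIN ANALYZE output ...
RESET enable_nestloop;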
Justin Pryzby writes:
> I believe the join is being (badly) underestimated, leading to a crappy plan
> involving multiple nested loop joins, which takes 2.5 hours instead of a
> handful of seconds; I believe that might be resolved by populating its MCV
> list..
With only two rows in the table, I'm not real sure why you'd need an MCV
list. Could we see the actual problem query (and the other table
schemas), rather than diving into the code first?
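For readers following along, a minimal sketch, assuming the two-row
bsm_to_switch table shown further down in the thread and a recent ANALYZE, of
how to see which per-column statistics (including any MCV list) the planner
actually has, and how its row estimates compare with reality; "report_data"
and the join condition are stand-ins, since the real report query is not part
of this excerpt.

-- pg_stats has one row per analyzed column; most_common_vals is the MCV list.
ANALYZE bsm_to_switch;

SELECT attname, n_distinct, most_common_vals, most_common_freqs
FROM pg_stats
WHERE tablename = 'bsm_to_switch';

-- Compare the planner's "rows=" estimates against the actual counts; the
-- "report_data" table and join column are placeholders, not from the thread.
EXPLAIN ANALYZE
SELECT *
FROM report_data r
JOIN bsm_to_switch b ON b.bsm = r.bsm;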
We have a report query which joins (multiple times, actually) against this
trivial, tiny table:
ts=# \d bsm_to_switch
Table "public.bsm_to_switch"
Column | Type | Modifiers
+--+---
bsm| text | not null
switch | text | not null
ts=# SELECT length(bsm), length(switch) FROM bsm_to_switch;
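A minimal sketch of the table, reconstructed only from the \d output above;
the two actual rows are not shown in this excerpt, so the inserted values
below are placeholders.

CREATE TABLE bsm_to_switch (
    bsm    text NOT NULL,
    switch text NOT NULL
);

-- Placeholder values; the real bsm/switch contents are not part of this excerpt.
INSERT INTO bsm_to_switch (bsm, switch)
VALUES ('bsm-a', 'switch-a'),
       ('bsm-b', 'switch-b');

ANALYZE bsm_to_switch;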