> I see an estimate for 1000 rows in your EXPLAIN output too, so you're
> experiencing the same, although in your case the estimate of 1000 might be
> more accurate. The misestimation was causing significant performance
> problems for me.

> My solution was to wrap generate_series() in a custom function that had a
> ROWS qualifier

That's interesting! I actually wasn't familiar with the ROWS feature at
all, so that is good knowledge to pocket.
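
For the archives, here is a minimal sketch of what I understand that to look
like (the function name, signature, and the ROWS value are just placeholders,
not necessarily what you used):

    CREATE FUNCTION ts_series(start_ts timestamp, end_ts timestamp, step interval)
    RETURNS SETOF timestamp
    LANGUAGE sql
    STABLE
    ROWS 10000  -- planner assumes ~10000 result rows instead of the default 1000
    AS $$
        SELECT generate_series(start_ts, end_ts, step);
    $$;

    -- then join against ts_series(...) in the query instead of calling
    -- generate_series(...) directly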

In my case, I think the number of rows will vary quite a bit across different
time periods/resolutions (and 1000 might not be a bad estimate for some of
the workloads). I do wonder whether, if the planner had a sense of how big
the series result could be for longer periods/finer resolutions (information
I could trivially compute outside the query and encode into it explicitly if
need be), it would avoid or at least minimize the Nested Loop, but I'm not
sure of the best way to communicate that information to it.
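
The only lever I can see so far is that same ROWS clause, and as far as I can
tell it has to be a constant baked into the function definition, so the
closest I can get to encoding the size in the query is defining a few wrappers
with different estimates and having the query builder pick one based on the
period and resolution. A rough sketch (names and numbers are made up):

    -- short periods / coarse resolutions
    CREATE FUNCTION ts_series_small(start_ts timestamp, end_ts timestamp, step interval)
    RETURNS SETOF timestamp LANGUAGE sql STABLE ROWS 1000
    AS $$ SELECT generate_series(start_ts, end_ts, step); $$;

    -- long periods / fine resolutions
    CREATE FUNCTION ts_series_large(start_ts timestamp, end_ts timestamp, step interval)
    RETURNS SETOF timestamp LANGUAGE sql STABLE ROWS 100000
    AS $$ SELECT generate_series(start_ts, end_ts, step); $$;

Since the application already knows roughly how many buckets
(end_ts - start_ts) / step works out to, it could substitute the appropriate
function name when building the query.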

Anyway, thank you for sharing! Very helpful to hear how other people have
dealt with similar situations.
