On Thu, Jan 14, 2021 at 9:29 AM Magnus Hagander <mag...@hagander.net> wrote:
> Do you have any actual metrics behind specifically choosing the value
> 3? Or is that off a gut feeling?
I have no metrics, exactly, but I'm sure that the trend I mentioned -- page cleaning/dirtying becoming the bottleneck more and more these days -- is real. The trend seems apparent to everyone, so I'm confident that I basically have the right idea here. I'm even a little concerned that the default should actually be lowered to 2 instead.

That said, I don't actually accept what seems to be the original premise of these GUCs, so I am not interested in using that premise to justify changing the vacuum_cost_page_miss default. The premise seems to be: VACUUM's behavior is determined by treating it as an optimization problem, so all you as the DBA need to do is characterize the cost of each kind of elementary operation using the GUCs -- the dynamic algorithm will do the rest. What algorithm might that be, though? This is not the optimizer, and there is no scope to come up with a cheaper plan for VACUUM. Why not throttle long-running queries instead, or as well?

More on the first principles of the costing stuff in a bit, when I respond to Robert...

-- 
Peter Geoghegan
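To make the relative-weight question concrete, here is a minimal sketch (not PostgreSQL source, just the documented model) of how the cost-based delay GUCs interact: VACUUM accumulates a cost per page touched and sleeps once the running total crosses vacuum_cost_limit. The specific GUC values below assume the proposed miss cost of 3 and the stock defaults for the rest:

```python
# Hedged model of PostgreSQL's cost-based vacuum delay. Assumed values:
# vacuum_cost_page_miss = 3 is the proposal under discussion; the other
# settings are the stock defaults.
vacuum_cost_page_hit = 1      # page found in shared buffers
vacuum_cost_page_miss = 3     # page read from disk (thread proposes 3, maybe 2)
vacuum_cost_page_dirty = 20   # page dirtied by VACUUM
vacuum_cost_limit = 200       # budget before a throttle sleep

def throttle_sleeps(page_hits, page_misses, page_dirties):
    """Count how many times VACUUM would sleep for this page workload."""
    total_cost = (page_hits * vacuum_cost_page_hit
                  + page_misses * vacuum_cost_page_miss
                  + page_dirties * vacuum_cost_page_dirty)
    return total_cost // vacuum_cost_limit

# With dirtying weighted ~7x a miss, a dirty-heavy VACUUM is throttled
# far sooner than a read-mostly VACUUM touching the same number of pages:
read_mostly = throttle_sleeps(page_hits=0, page_misses=1000, page_dirties=0)
dirty_heavy = throttle_sleeps(page_hits=0, page_misses=0, page_dirties=1000)
print(read_mostly, dirty_heavy)  # 15 vs. 100 sleeps
```

The point of lowering vacuum_cost_page_miss is exactly this ratio: the smaller the miss cost relative to the dirty cost, the more the throttling targets write pressure rather than reads.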