Re: [GENERAL] Maximum reasonable free space map

2008-12-17 Thread Greg Stark
On Wed, Dec 17, 2008 at 5:45 AM, Scott Marlowe wrote:
> If you've got 40M rows and 10% are updated each day, then it's likely
> you'll want 4M fsm entries available for those dead rows.

FWIW you only need an entry for each *page* of the table, not every row. Of course if you have 40M rows and 1…
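Greg Stark's correction above changes the arithmetic by roughly two orders of magnitude: the FSM tracks heap pages, not tuples. A minimal sketch of that sizing, assuming a hypothetical density of ~100 tuples per 8 KB heap page (real density depends on row width):

```python
# Back-of-the-envelope FSM sizing, following the point above that the free
# space map tracks *pages*, not rows.  rows_per_page=100 is an illustrative
# assumption, not a measured figure.

def fsm_pages_needed(total_rows, rows_per_page=100):
    """Upper bound on FSM page slots: one per heap page of the table."""
    return -(-total_rows // rows_per_page)  # ceiling division

pages = fsm_pages_needed(40_000_000)  # the 40M-row table from the thread
print(pages)  # 400000 -- an order of magnitude below the 4M-row estimate
```

Even if 10% of the rows die each day, the pages those rows sit on cannot exceed this page count, so the page-level bound is the one that matters for `max_fsm_pages`.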

Re: [GENERAL] Maximum reasonable free space map

2008-12-17 Thread Grzegorz Jaśkiewicz
On Wed, Dec 17, 2008 at 5:45 AM, Scott Marlowe wrote:
> It's all about the size of your tables. If you've got 1 table with
> 100k rows that's updated a lot then an fsm of 100k is likely
> reasonable, assuming you've got autovac keeping things in check. Got
> 4G rows but none are ever updated, th…

Re: [GENERAL] Maximum reasonable free space map

2008-12-16 Thread Scott Marlowe
On Tue, Dec 16, 2008 at 5:55 PM, Phillip Berry wrote:
> Hi Everyone,
>
> Just wondering what the maximum reasonable free space map setting should be?
> I'm receiving the following advice from vacuum:
>
> INFO: free space map contains 170803 pages in 117 relations
> DETAIL: A total of 185000…

Re: [GENERAL] Maximum reasonable free space map

2008-12-16 Thread Gregory Stark
Phillip Berry writes:
> So I guess my question is, is there a point where you start to see
> diminishing returns or even negative returns by setting the fsm too high?

There is no benefit to having FSM larger than necessary, so I suppose that qualifies as "diminishing returns". The only negative…
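The downside of oversizing that Gregory Stark is alluding to is shared memory. Per the PostgreSQL 8.x documentation, each FSM page slot consumes about 6 bytes of shared memory and each tracked relation about 70 bytes; treating both constants as approximations, the cost of a generous setting works out to very little:

```python
# Rough shared-memory cost of the pre-8.4 FSM.  The per-slot and
# per-relation byte costs are approximate figures from the 8.x docs.

BYTES_PER_PAGE_SLOT = 6
BYTES_PER_RELATION = 70

def fsm_shared_memory_bytes(max_fsm_pages, max_fsm_relations):
    return (max_fsm_pages * BYTES_PER_PAGE_SLOT
            + max_fsm_relations * BYTES_PER_RELATION)

# Sizing generously for the 733008 slots VACUUM reported in this thread:
cost = fsm_shared_memory_bytes(750_000, 1_000)
print(round(cost / 1024 / 1024, 1))  # ~4.4 MB -- tiny next to 16 GB of RAM
```

So on the hardware described in this thread, setting the FSM well above the reported requirement costs a few megabytes of shared memory, which is why the practical advice is simply to follow what VACUUM suggests with headroom.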

Re: [GENERAL] Maximum reasonable free space map

2008-12-16 Thread Grzegorz Jaśkiewicz
On Wed, Dec 17, 2008 at 2:29 AM, Phillip Berry wrote:
> The data in nearly every table is constantly changing due to a high volume
> of new data constantly coming in, processing on the existing data and heavy
> reporting being done all at once all day and night.
>
> So I guess my question…

Re: [GENERAL] Maximum reasonable free space map

2008-12-16 Thread Phillip Berry
The data in nearly every table is constantly changing due to a high volume of new data constantly coming in, processing on the existing data and heavy reporting being done all at once all day and night.

So I guess my question is, is there a point where you start to see diminishing returns or…

Re: [GENERAL] Maximum reasonable free space map

2008-12-16 Thread Grzegorz Jaśkiewicz
On Wed, Dec 17, 2008 at 12:55 AM, Phillip Berry wrote:
> I thought 185K was pretty high, is going to 700K+ reasonable? I've got 16GB
> of ram and am running very high volume 100GB+ DBs.

It all depends on how often the data changes. I would go with whatever vacuum is suggesting on producti…

[GENERAL] Maximum reasonable free space map

2008-12-16 Thread Phillip Berry
Hi Everyone,

Just wondering what the maximum reasonable free space map setting should be? I'm receiving the following advice from vacuum:

INFO: free space map contains 170803 pages in 117 relations
DETAIL: A total of 185000 page slots are in use (including overhead). 733008 page slots are re…
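The VACUUM hint above maps onto two settings in postgresql.conf on PostgreSQL 8.3 and earlier (the manually-sized FSM was removed in 8.4, where free-space tracking became automatic). A hedged sketch using only the numbers reported in this thread, not tuned advice:

```ini
# postgresql.conf -- applies to PostgreSQL 8.3 and earlier only.
# Values are a sketch based on the VACUUM output above.
max_fsm_pages = 750000      # comfortably above the 733008 slots reported
max_fsm_relations = 1000    # well above the 117 relations reported
```

Both parameters allocate shared memory, so changing them takes effect only after a server restart.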