On 2018-08-10 22:57:57 +0200, Tomas Vondra wrote:
> On 08/09/2018 07:47 PM, Alvaro Herrera wrote:
> > On 2018-Aug-09, Tomas Vondra wrote:
> >
> >> I suppose there are reasons why it's done this way, and admittedly the test
> >> that happens to trigger this is a bit extreme (essentially running pgbench
> >> concurrently with 'vacuum full pg_class' in a loop). I'm not sure it's
> >> extreme enough to deem it not an issue, because people using many temporary
> >> tables often deal with bloat by doing frequent vacuum full on catalogs.
> >
> > Actually, it seems to me that ApplyLogicalMappingFile is just leaking
> > the file descriptor for no good reason. There's a different
> > OpenTransientFile call in ReorderBufferRestoreChanges that is not
> > intended to be closed immediately, but the other one seems a plain bug,
> > easy enough to fix.
>
> Indeed. Adding a CloseTransientFile to ApplyLogicalMappingFile solves
> the issue with hitting maxAllocatedDescs. Barring objections I'll commit
> this shortly.
Yea, that's clearly a bug. I've not seen a patch, so I can't quite
formally sign off, but it seems fairly obvious.

> But while running the tests on this machine, I repeatedly got pgbench
> failures like this:
>
> client 2 aborted in command 0 of script 0; ERROR: could not read block
> 3 in file "base/16384/24573": read only 0 of 8192 bytes
>
> That kinda reminds me of the issues we're observing on some buildfarm
> machines, I wonder if it's the same thing.

Oooh, that's interesting! What's the precise recipe that gets you there?

Greetings,

Andres Freund
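
For reference, the fix discussed above amounts to closing the transient
descriptor before ApplyLogicalMappingFile returns, so that repeated calls
stop consuming slots counted against maxAllocatedDescs. The following is
only a rough sketch of that pattern, assuming the two-argument
OpenTransientFile/CloseTransientFile API from fd.c; the function name
apply_mapping_file_sketch and its body are simplified illustrations, not
the actual patch:

    #include "postgres.h"

    #include <fcntl.h>
    #include <unistd.h>

    #include "storage/fd.h"

    /*
     * Illustrative only: read a mapping-style file through a transient fd
     * and make sure the fd is released before returning.  The real
     * ApplyLogicalMappingFile() in reorderbuffer.c applies
     * LogicalRewriteMappingData records in the loop body.
     */
    static void
    apply_mapping_file_sketch(const char *path)
    {
        int         fd;
        char        buf[BLCKSZ];
        int         readBytes;

        fd = OpenTransientFile(path, O_RDONLY | PG_BINARY);
        if (fd < 0)
            ereport(ERROR,
                    (errcode_for_file_access(),
                     errmsg("could not open file \"%s\": %m", path)));

        for (;;)
        {
            readBytes = read(fd, buf, sizeof(buf));
            if (readBytes < 0)
                ereport(ERROR,
                        (errcode_for_file_access(),
                         errmsg("could not read file \"%s\": %m", path)));
            if (readBytes == 0)     /* reached EOF */
                break;

            /* ... apply the mapping data just read ... */
        }

        /* The fix: without this, every call leaks one transient fd. */
        CloseTransientFile(fd);
    }

The OpenTransientFile call in ReorderBufferRestoreChanges that Alvaro
mentions is different: that descriptor is intentionally kept open for later
reads, which is why only the ApplyLogicalMappingFile call needs the
CloseTransientFile.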