Re: Using per-transaction memory contexts for storing decoded tuples

2024-10-16 Thread Masahiko Sawada
On Wed, Oct 16, 2024 at 10:32 AM Masahiko Sawada wrote: > > On Tue, Oct 15, 2024 at 9:01 PM Amit Kapila wrote: > > > > On Tue, Oct 15, 2024 at 11:15 PM Masahiko Sawada > > wrote: > > > > > > On Sun, Oct 13, 2024 at 11:00 PM Amit Kapila > > > wrote: > > > > > > > > On Fri, Oct 11, 2024 at 3:40

Re: Using per-transaction memory contexts for storing decoded tuples

2024-10-16 Thread Masahiko Sawada
On Tue, Oct 15, 2024 at 9:01 PM Amit Kapila wrote: > > On Tue, Oct 15, 2024 at 11:15 PM Masahiko Sawada > wrote: > > > > On Sun, Oct 13, 2024 at 11:00 PM Amit Kapila > > wrote: > > > > > > On Fri, Oct 11, 2024 at 3:40 AM Masahiko Sawada > > > wrote: > > > > > > > > Please find the attached p

Re: Using per-transaction memory contexts for storing decoded tuples

2024-10-15 Thread Amit Kapila
On Tue, Oct 15, 2024 at 11:15 PM Masahiko Sawada wrote: > > On Sun, Oct 13, 2024 at 11:00 PM Amit Kapila wrote: > > > > On Fri, Oct 11, 2024 at 3:40 AM Masahiko Sawada > > wrote: > > > > > > Please find the attached patches. > > > > > Thank you for reviewing the patch! > > > > > @@ -343,9 +343,

Re: Using per-transaction memory contexts for storing decoded tuples

2024-10-15 Thread Masahiko Sawada
On Sun, Oct 13, 2024 at 11:00 PM Amit Kapila wrote: > > On Fri, Oct 11, 2024 at 3:40 AM Masahiko Sawada wrote: > > > > Please find the attached patches. > > Thank you for reviewing the patch! > > @@ -343,9 +343,9 @@ ReorderBufferAllocate(void) > */ > buffer->tup_context = GenerationContextC

Re: Using per-transaction memory contexts for storing decoded tuples

2024-10-13 Thread Amit Kapila
On Fri, Oct 11, 2024 at 3:40 AM Masahiko Sawada wrote: > > Please find the attached patches. > @@ -343,9 +343,9 @@ ReorderBufferAllocate(void) */ buffer->tup_context = GenerationContextCreate(new_ctx, "Tuples", - SLAB_LARGE_BLOCK_SIZE, - SLAB_LARGE_BLOCK_SIZE, - SLAB_LARGE_BLOCK_SIZ

Re: Using per-transaction memory contexts for storing decoded tuples

2024-10-10 Thread Masahiko Sawada
On Thu, Oct 10, 2024 at 8:26 AM Masahiko Sawada wrote: > > On Thu, Oct 10, 2024 at 8:04 AM Fujii Masao > wrote: > > > > > > > > On 2024/10/04 3:32, Masahiko Sawada wrote: > > > Yes, but as for this macro specifically, I thought that it might be > > > better to keep it, since it avoids breaking e

Re: Using per-transaction memory contexts for storing decoded tuples

2024-10-10 Thread Masahiko Sawada
On Thu, Oct 10, 2024 at 8:04 AM Fujii Masao wrote: > > > > On 2024/10/04 3:32, Masahiko Sawada wrote: > > Yes, but as for this macro specifically, I thought that it might be > > better to keep it, since it avoids breaking extensions unnecessarily > > and it seems to be natural to have it as an opti

Re: Using per-transaction memory contexts for storing decoded tuples

2024-10-10 Thread Fujii Masao
On 2024/10/04 3:32, Masahiko Sawada wrote: Yes, but as for this macro specifically, I thought that it might be better to keep it, since it avoids breaking extensions unnecessarily and it seems to be natural to have it as an option for slab context. If the macro has value, I'm okay with leavin

Re: Using per-transaction memory contexts for storing decoded tuples

2024-10-03 Thread Masahiko Sawada
On Thu, Oct 3, 2024 at 2:46 AM Fujii Masao wrote: > > > > On 2024/10/03 13:47, Masahiko Sawada wrote: > >>> I agree that the overhead will be much less visible in real workloads. > >>> +1 to use a smaller block (i.e. 8kB). > > +1 > > > >>> It's easy to backpatch to old > >>> branches (if we agree)

Re: Using per-transaction memory contexts for storing decoded tuples

2024-10-03 Thread Fujii Masao
On 2024/10/03 13:47, Masahiko Sawada wrote: I agree that the overhead will be much less visible in real workloads. +1 to use a smaller block (i.e. 8kB). +1 It's easy to backpatch to old branches (if we agree) +1 It seems that only reorderbuffer.c uses the LARGE macro so that it can b

Re: Using per-transaction memory contexts for storing decoded tuples

2024-10-02 Thread Masahiko Sawada
On Wed, Oct 2, 2024 at 9:42 PM Hayato Kuroda (Fujitsu) wrote: > > Dear Sawada-san, Amit, > > > > So, decoding a large transaction with many smaller allocations can > > > have ~2.2% overhead with a smaller block size (say 8kB vs 8MB). In > > real workloads, we will have fewer such large transacti

RE: Using per-transaction memory contexts for storing decoded tuples

2024-10-02 Thread Hayato Kuroda (Fujitsu)
Dear Sawada-san, Amit, > > So, decoding a large transaction with many smaller allocations can > > have ~2.2% overhead with a smaller block size (say 8kB vs 8MB). In > > real workloads, we will have fewer such large transactions or a mix of > > small and large transactions. That will make the overh

Re: Using per-transaction memory contexts for storing decoded tuples

2024-10-01 Thread Masahiko Sawada
On Tue, Oct 1, 2024 at 5:15 AM Amit Kapila wrote: > > On Fri, Sep 27, 2024 at 10:24 PM Masahiko Sawada > wrote: > > > > On Fri, Sep 27, 2024 at 12:39 AM Shlok Kyal > > wrote: > > > > > > On Mon, 23 Sept 2024 at 09:59, Amit Kapila > > > wrote: > > > > > > > > On Sun, Sep 22, 2024 at 11:27 AM

Re: Using per-transaction memory contexts for storing decoded tuples

2024-10-01 Thread Amit Kapila
On Fri, Sep 27, 2024 at 10:24 PM Masahiko Sawada wrote: > > On Fri, Sep 27, 2024 at 12:39 AM Shlok Kyal wrote: > > > > On Mon, 23 Sept 2024 at 09:59, Amit Kapila wrote: > > > > > > On Sun, Sep 22, 2024 at 11:27 AM David Rowley > > > wrote: > > > > > > > > On Fri, 20 Sept 2024 at 17:46, Amit Ka

Re: Using per-transaction memory contexts for storing decoded tuples

2024-09-27 Thread Masahiko Sawada
On Fri, Sep 27, 2024 at 12:39 AM Shlok Kyal wrote: > > On Mon, 23 Sept 2024 at 09:59, Amit Kapila wrote: > > > > On Sun, Sep 22, 2024 at 11:27 AM David Rowley wrote: > > > > > > On Fri, 20 Sept 2024 at 17:46, Amit Kapila > > > wrote: > > > > > > > > On Fri, Sep 20, 2024 at 5:13 AM David Rowley

Re: Using per-transaction memory contexts for storing decoded tuples

2024-09-27 Thread Shlok Kyal
On Mon, 23 Sept 2024 at 09:59, Amit Kapila wrote: > > On Sun, Sep 22, 2024 at 11:27 AM David Rowley wrote: > > > > On Fri, 20 Sept 2024 at 17:46, Amit Kapila wrote: > > > > > > On Fri, Sep 20, 2024 at 5:13 AM David Rowley wrote: > > > > In general, it's a bit annoying to have to code around thi

Re: Using per-transaction memory contexts for storing decoded tuples

2024-09-25 Thread Masahiko Sawada
On Sun, Sep 22, 2024 at 9:29 PM Amit Kapila wrote: > > On Sun, Sep 22, 2024 at 11:27 AM David Rowley wrote: > > > > On Fri, 20 Sept 2024 at 17:46, Amit Kapila wrote: > > > > > > On Fri, Sep 20, 2024 at 5:13 AM David Rowley wrote: > > > > In general, it's a bit annoying to have to code around th

Re: Using per-transaction memory contexts for storing decoded tuples

2024-09-23 Thread Masahiko Sawada
On Thu, Sep 19, 2024 at 10:44 PM Amit Kapila wrote: > > On Thu, Sep 19, 2024 at 10:33 PM Masahiko Sawada > wrote: > > > > On Wed, Sep 18, 2024 at 8:55 PM Amit Kapila wrote: > > > > > > On Thu, Sep 19, 2024 at 6:46 AM David Rowley wrote: > > > > > > > > On Thu, 19 Sept 2024 at 11:54, Masahiko S

Re: Using per-transaction memory contexts for storing decoded tuples

2024-09-22 Thread Amit Kapila
On Fri, Sep 20, 2024 at 10:53 PM Masahiko Sawada wrote: > > On Thu, Sep 19, 2024 at 10:46 PM Amit Kapila wrote: > > > > On Fri, Sep 20, 2024 at 5:13 AM David Rowley wrote: > > > > > > On Fri, 20 Sept 2024 at 05:03, Masahiko Sawada > > > wrote: > > > > I've done other benchmarking tests while c

Re: Using per-transaction memory contexts for storing decoded tuples

2024-09-22 Thread Amit Kapila
On Sun, Sep 22, 2024 at 11:27 AM David Rowley wrote: > > On Fri, 20 Sept 2024 at 17:46, Amit Kapila wrote: > > > > On Fri, Sep 20, 2024 at 5:13 AM David Rowley wrote: > > > In general, it's a bit annoying to have to code around this > > > GenerationContext fragmentation issue. > > > > Right, and

Re: Using per-transaction memory contexts for storing decoded tuples

2024-09-21 Thread David Rowley
On Fri, 20 Sept 2024 at 17:46, Amit Kapila wrote: > > On Fri, Sep 20, 2024 at 5:13 AM David Rowley wrote: > > In general, it's a bit annoying to have to code around this > > GenerationContext fragmentation issue. > > Right, and I am also slightly afraid that this may cause some > regression i

Re: Using per-transaction memory contexts for storing decoded tuples

2024-09-20 Thread Masahiko Sawada
On Thu, Sep 19, 2024 at 10:46 PM Amit Kapila wrote: > > On Fri, Sep 20, 2024 at 5:13 AM David Rowley wrote: > > > > On Fri, 20 Sept 2024 at 05:03, Masahiko Sawada > > wrote: > > > I've done other benchmarking tests while changing the memory block > > > sizes from 8kB to 8MB. I measured the exec

RE: Using per-transaction memory contexts for storing decoded tuples

2024-09-20 Thread Hayato Kuroda (Fujitsu)
Dear Sawada-san, > Thank you for your interest in this patch. I've just shared some > benchmark results (with a patch) that could be different depending on > the environment[1]. I would appreciate it if you could also run similar > tests and share the results. Okay, I did similar tests, the attached sc

Re: Using per-transaction memory contexts for storing decoded tuples

2024-09-19 Thread Amit Kapila
On Fri, Sep 20, 2024 at 5:13 AM David Rowley wrote: > > On Fri, 20 Sept 2024 at 05:03, Masahiko Sawada wrote: > > I've done other benchmarking tests while changing the memory block > > sizes from 8kB to 8MB. I measured the execution time of logical > > decoding of one transaction that inserted 10

Re: Using per-transaction memory contexts for storing decoded tuples

2024-09-19 Thread Amit Kapila
On Thu, Sep 19, 2024 at 10:33 PM Masahiko Sawada wrote: > > On Wed, Sep 18, 2024 at 8:55 PM Amit Kapila wrote: > > > > On Thu, Sep 19, 2024 at 6:46 AM David Rowley wrote: > > > > > > On Thu, 19 Sept 2024 at 11:54, Masahiko Sawada > > > wrote: > > > > I've done some benchmark tests for three di

Re: Using per-transaction memory contexts for storing decoded tuples

2024-09-19 Thread David Rowley
On Fri, 20 Sept 2024 at 05:03, Masahiko Sawada wrote: > I've done other benchmarking tests while changing the memory block > sizes from 8kB to 8MB. I measured the execution time of logical > decoding of one transaction that inserted 10M rows. I set > logical_decoding_work_mem large enough to avoid

Re: Using per-transaction memory contexts for storing decoded tuples

2024-09-19 Thread Masahiko Sawada
Hi, On Mon, Sep 16, 2024 at 10:56 PM Hayato Kuroda (Fujitsu) wrote: > > Hi, > > > We have several reports that logical decoding uses memory much more > > than logical_decoding_work_mem[1][2][3]. For instance in one of the > > reports[1], even though users set logical_decoding_work_mem to > > '256

Re: Using per-transaction memory contexts for storing decoded tuples

2024-09-19 Thread Masahiko Sawada
On Wed, Sep 18, 2024 at 8:55 PM Amit Kapila wrote: > > On Thu, Sep 19, 2024 at 6:46 AM David Rowley wrote: > > > > On Thu, 19 Sept 2024 at 11:54, Masahiko Sawada > > wrote: > > > I've done some benchmark tests for three different code bases with > > > different test cases. In short, reducing th

Re: Using per-transaction memory contexts for storing decoded tuples

2024-09-18 Thread Amit Kapila
On Thu, Sep 19, 2024 at 6:46 AM David Rowley wrote: > > On Thu, 19 Sept 2024 at 11:54, Masahiko Sawada wrote: > > I've done some benchmark tests for three different code bases with > > different test cases. In short, reducing the generation memory context > > block size to 8kB seems to be promisi

Re: Using per-transaction memory contexts for storing decoded tuples

2024-09-18 Thread Fujii Masao
On 2024/09/19 8:53, Masahiko Sawada wrote: On Tue, Sep 17, 2024 at 2:06 AM Amit Kapila wrote: On Mon, Sep 16, 2024 at 10:43 PM Masahiko Sawada wrote: On Fri, Sep 13, 2024 at 3:58 AM Amit Kapila wrote: Can we try reducing the size of 8MB memory blocks? The comment atop allocation says:

Re: Using per-transaction memory contexts for storing decoded tuples

2024-09-18 Thread David Rowley
On Thu, 19 Sept 2024 at 11:54, Masahiko Sawada wrote: > I've done some benchmark tests for three different code bases with > different test cases. In short, reducing the generation memory context > block size to 8kB seems to be promising; it mitigates the problem > while keeping a similar performa

Re: Using per-transaction memory contexts for storing decoded tuples

2024-09-18 Thread Masahiko Sawada
On Tue, Sep 17, 2024 at 2:06 AM Amit Kapila wrote: > > On Mon, Sep 16, 2024 at 10:43 PM Masahiko Sawada > wrote: > > > > On Fri, Sep 13, 2024 at 3:58 AM Amit Kapila wrote: > > > > > > Can we try reducing the size of > > > 8MB memory blocks? The comment atop allocation says: "XXX the > > > alloc

Re: Using per-transaction memory contexts for storing decoded tuples

2024-09-17 Thread Masahiko Sawada
On Tue, Sep 17, 2024 at 2:06 AM Amit Kapila wrote: > > On Mon, Sep 16, 2024 at 10:43 PM Masahiko Sawada > wrote: > > > > On Fri, Sep 13, 2024 at 3:58 AM Amit Kapila wrote: > > > > > > On Thu, Sep 12, 2024 at 4:03 AM Masahiko Sawada > > > wrote: > > > > > > > > We have several reports that log

Re: Using per-transaction memory contexts for storing decoded tuples

2024-09-17 Thread Amit Kapila
On Mon, Sep 16, 2024 at 10:43 PM Masahiko Sawada wrote: > > On Fri, Sep 13, 2024 at 3:58 AM Amit Kapila wrote: > > > > On Thu, Sep 12, 2024 at 4:03 AM Masahiko Sawada > > wrote: > > > > > > We have several reports that logical decoding uses memory much more > > > than logical_decoding_work_mem[

RE: Using per-transaction memory contexts for storing decoded tuples

2024-09-16 Thread Hayato Kuroda (Fujitsu)
Hi, > We have several reports that logical decoding uses memory much more > than logical_decoding_work_mem[1][2][3]. For instance in one of the > reports[1], even though users set logical_decoding_work_mem to > '256MB', a walsender process was killed by OOM because of using more > than 4GB memory.

Re: Using per-transaction memory contexts for storing decoded tuples

2024-09-16 Thread Masahiko Sawada
On Fri, Sep 13, 2024 at 3:58 AM Amit Kapila wrote: > > On Thu, Sep 12, 2024 at 4:03 AM Masahiko Sawada wrote: > > > > We have several reports that logical decoding uses memory much more > > than logical_decoding_work_mem[1][2][3]. For instance in one of the > > reports[1], even though users set l

Re: Using per-transaction memory contexts for storing decoded tuples

2024-09-13 Thread Amit Kapila
On Thu, Sep 12, 2024 at 4:03 AM Masahiko Sawada wrote: > > We have several reports that logical decoding uses memory much more > than logical_decoding_work_mem[1][2][3]. For instance in one of the > reports[1], even though users set logical_decoding_work_mem to > '256MB', a walsender process was k
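For context on the report quoted above: the setting involved caps only the decoded-change memory that the reorder buffer accounts for, which is why allocator-level block retention can push the walsender's actual memory use far beyond it. A minimal configuration sketch (values are illustrative, not recommendations):

```
# postgresql.conf (illustrative values)
wal_level = logical                  # required for logical decoding
logical_decoding_work_mem = 256MB    # threshold before changes spill to disk or are streamed
```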

Re: Using per-transaction memory contexts for storing decoded tuples

2024-09-12 Thread torikoshia
On 2024-09-12 07:32, Masahiko Sawada wrote: Thanks a lot for working on this! Hi all, We have several reports that logical decoding uses memory much more than logical_decoding_work_mem[1][2][3]. For instance in one of the reports[1], even though users set logical_decoding_work_mem to '256MB',