Hi Matteo
> Brokers are never going to look into batches, they will just check the
> batch header, without decompression, and not dig into the individual
> message properties
I agree with this. I don't think PIP-105 supports batch send, so PIP-105
may not be a good case for this change.
Th
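For context on the point above: a PIP-105 style broker-side filter only sees the metadata of the whole entry, i.e. the batch header, and never the per-message properties packed inside the (possibly compressed) payload. Below is a rough, hypothetical sketch of such a filter; the EntryFilter, FilterContext and MessageMetadata names are recalled from the PIP-105 plugin API and should be checked against the Pulsar sources.

import org.apache.bookkeeper.mledger.Entry;
import org.apache.pulsar.broker.service.plugin.EntryFilter;
import org.apache.pulsar.broker.service.plugin.FilterContext;
import org.apache.pulsar.common.api.proto.KeyValue;
import org.apache.pulsar.common.api.proto.MessageMetadata;

// Sketch of a PIP-105 style filter: it can only inspect the metadata of the
// whole entry (the batch header); the per-message properties sit inside the
// payload and are never decompressed or deserialized here.
public class HeaderOnlyFilter implements EntryFilter {

    @Override
    public FilterResult filterEntry(Entry entry, FilterContext context) {
        // Entry-level metadata only; for a batched entry this is the batch
        // header (e.g. numMessagesInBatch), not the individual messages.
        MessageMetadata metadata = context.getMsgMetadata();
        for (KeyValue kv : metadata.getPropertiesList()) {
            if ("drop".equals(kv.getKey())) {
                return FilterResult.REJECT;
            }
        }
        return FilterResult.ACCEPT;
    }

    @Override
    public void close() {
    }
}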
--
Matteo Merli
On Tue, Jul 12, 2022 at 12:54 AM Enrico Olivelli wrote:
>
> Hello,
> I think that we could implement a small but effective enhancement to batching.
>
> It may happen that even if you enable batching you end up creating
> entries with only one message.
>
> Processing batch messages r
Got it, thanks Enrico
Best,
Xiaoyu Hou
Enrico Olivelli wrote on Tue, Jul 12, 2022 at 22:40:
> Xiaoyu
>
> On Tue, Jul 12, 2022 at 16:36 Anon Hxy
> wrote:
> >
> > Hi Enrico
> >
> > I am interested in working on this task. Could you please assign the
> work
> > to me? @AnonHxy
>
>
> This is th
Xiaoyu
On Tue, Jul 12, 2022 at 16:36 Anon Hxy
wrote:
>
> Hi Enrico
>
> I am interested in working on this task. Could you please assign the work
> to me? @AnonHxy
This is the issue
https://github.com/apache/pulsar/issues/16547
I am not sure I can "assign" the issue to you, bu
Hi Enrico
I am interested in working on this task. Could you please assign the work
to me? @AnonHxy
Thanks,
Xiaoyu Hou
Enrico Olivelli wrote on Tue, Jul 12, 2022 at 22:28:
> Thank you all!
>
> I am going to create an issue/feature request
>
> Is there anyone interested in working on that?
>
> We need a P
Thank you all!
I am going to create an issue/feature request
Is there anyone interested in working on that?
We need a PIP.
The implementation probably won't be hard, but it may impact a few
test cases in the repo, because sometimes we turn on batching to
generate messages
and now if you don't w
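As an illustration of the kind of assumption such tests make (a hypothetical sketch, not a test from the repo, assuming a local standalone broker and a made-up topic name): with batching enabled, even a single published message currently reaches the consumer wrapped in a batch, so its message id is a BatchMessageIdImpl. If single-message batches are no longer wrapped, checks like the one below would stop holding.

import org.apache.pulsar.client.api.Consumer;
import org.apache.pulsar.client.api.Message;
import org.apache.pulsar.client.api.Producer;
import org.apache.pulsar.client.api.PulsarClient;
import org.apache.pulsar.client.api.Schema;
import org.apache.pulsar.client.impl.BatchMessageIdImpl;

public class BatchingTestAssumption {
    public static void main(String[] args) throws Exception {
        PulsarClient client = PulsarClient.builder()
                .serviceUrl("pulsar://localhost:6650") // assumption: local broker
                .build();
        String topic = "persistent://public/default/batching-assumption";

        Consumer<String> consumer = client.newConsumer(Schema.STRING)
                .topic(topic)
                .subscriptionName("sub")
                .subscribe();

        // Batching is enabled only to force the entry to be written as a batch.
        Producer<String> producer = client.newProducer(Schema.STRING)
                .topic(topic)
                .enableBatching(true)
                .create();
        producer.send("only message in the batch");

        Message<String> msg = consumer.receive();
        // Today this prints true because even a lone message is wrapped in a
        // batch envelope; with the proposed change it would become false.
        System.out.println(msg.getMessageId() instanceof BatchMessageIdImpl);

        consumer.acknowledge(msg);
        producer.close();
        consumer.close();
        client.close();
    }
}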
+1
Good idea
Regards
Jiwei Guo (Tboy)
On Tue, Jul 12, 2022 at 9:07 PM PengHui Li wrote:
> +1
>
> We should start with a proposal, so that we can be clear on the
>
> On Tue, Jul 12, 2022 at 5:50 PM ZhangJian He wrote:
>
> > +1
> >
> > Thanks
> > ZhangJian He
> >
> > Qiang Huang wrote on Tue, Jul 12, 2022 at 17:
+1
We should start with a proposal, so that we can be clear on the
On Tue, Jul 12, 2022 at 5:50 PM ZhangJian He wrote:
> +1
>
> Thanks
> ZhangJian He
>
> Qiang Huang wrote on Tue, Jul 12, 2022 at 17:34:
>
> > +1
> > Good idea! It will greatly reduce the resource consumption of small
> > batches.
> >
> > Yubiao
+1
Good idea
Thanks,
Xiaoyu Hou
Enrico Olivelli wrote on Tue, Jul 12, 2022 at 15:54:
> Hello,
> I think that we could implement a small but effective enhancement to
> batching.
>
> It may happen that even if you enable batching you end up creating
> entries with only one message.
>
> Processing batch messages r
+1
Thanks
ZhangJian He
Qiang Huang wrote on Tue, Jul 12, 2022 at 17:34:
> +1
> Good idea! It will greatly reduce the resource consumption of small
> batches.
>
> Yubiao Feng wrote on Tue, Jul 12, 2022 at 16:00:
>
> > +1
> > Good idea.
> >
> > Thanks
> > Yubiao Feng
> >
> > On Tue, Jul 12, 2022 at 3:54 PM Enrico Olivelli
+1
Good idea! It will greatly reduce the resource consumption of small batches.
Yubiao Feng wrote on Tue, Jul 12, 2022 at 16:00:
> +1
> Good idea.
>
> Thanks
> Yubiao Feng
>
> On Tue, Jul 12, 2022 at 3:54 PM Enrico Olivelli
> wrote:
>
> > Hello,
> > I think that we could implement a small but effective enhan
+1
Good idea.
Thanks
Yubiao Feng
On Tue, Jul 12, 2022 at 3:54 PM Enrico Olivelli wrote:
> Hello,
> I think that we could implement a small but effective enhancement to
> batching.
>
> It may happen that even if you enable batching you end up creating
> entries with only one message.
>
> Processing
Hello,
I think that we could implement a small but effective enhancement to batching.
It may happen that even if you enable batching you end up creating
entries with only one message.
Processing batch messages requires a good amount of resources, both on
the broker and on the client side.
Especiall
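To make the scenario concrete, here is a minimal client-side sketch (assuming a local standalone broker at pulsar://localhost:6650; the topic name is made up): with batching enabled, a low-traffic producer regularly hits the batchingMaxPublishDelay timer with a single pending message, so the entry persisted by the broker is a batch envelope wrapping exactly one message.

import java.util.concurrent.TimeUnit;
import org.apache.pulsar.client.api.Producer;
import org.apache.pulsar.client.api.PulsarClient;
import org.apache.pulsar.client.api.Schema;

public class SingleMessageBatch {
    public static void main(String[] args) throws Exception {
        PulsarClient client = PulsarClient.builder()
                .serviceUrl("pulsar://localhost:6650")
                .build();

        Producer<String> producer = client.newProducer(Schema.STRING)
                .topic("persistent://public/default/low-traffic-topic")
                .enableBatching(true)
                .batchingMaxMessages(1000)
                .batchingMaxPublishDelay(10, TimeUnit.MILLISECONDS)
                .create();

        // With only one message produced before the publish-delay timer fires,
        // the client still builds a batch container, serializes the batch
        // metadata and (optionally) compresses it, and the broker and consumer
        // pay the cost of unpacking a batch that holds a single message.
        producer.send("lonely message");

        producer.close();
        client.close();
    }
}

The enhancement discussed in this thread would detect this case at flush time and publish the message without the batch envelope.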