Thanks, Dimitry.

Your solution seems the most appropriate for me, because the reasons why
consumers (POS terminals) will be added/removed are different from the
reasons why partitions will be added/removed.

I think that dividing the topic into partitions and dividing the POS
terminals into logical groups (each POS terminal in a group consumes one
partition) must be independent of each other. Creating a dedicated partition
for each specific POS terminal is a pitfall.
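
For concreteness, here is a minimal sketch of the approach I have in mind,
along the lines of your suggestion. The topic name "price-lists", the record-key
convention (key = target terminal ID) and the "terminal-42" ID are only
assumptions for illustration, not our real configuration. Each POS agent
subscribes to the same topic with its own group.id, so every agent sees every
record and simply skips price lists addressed to other terminals:

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

public class PosPriceListAgent {
    public static void main(String[] args) {
        // Assumed: each terminal knows its own unique ID (hard-coded here for illustration).
        String terminalId = "terminal-42";

        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "kafka:9092");
        // A unique group.id per terminal means every terminal receives every record.
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "pos-agent-" + terminalId);
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("price-lists")); // assumed topic name
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // Assumed convention: record key = ID of the terminal the price list is for.
                    if (terminalId.equals(record.key())) {
                        applyPriceList(record.value());
                    }
                    // Records addressed to other terminals are simply ignored.
                }
            }
        }
    }

    private static void applyPriceList(String priceList) {
        // Placeholder for the agent's real processing of a received price list.
        System.out.println("Applying price list: " + priceList);
    }
}

Even if each of the 20,000 terminals gets its own ~10 price lists per day,
that is only on the order of 200,000 small records per day flowing past each
agent, which seems negligible compared to operating 20,000 partitions.
Per-terminal ordering is preserved as long as a terminal's price lists are
produced with the same key, so they land in the same partition.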


Mon, Apr 1, 2019, 12:26 Dimitry Lvovsky <dlvov...@gmail.com>:

> Going off of what Hans mentioned, I don't see any reason for 200,000
> partitions... you don't need one partition per POS. You can have all of your
> POS terminals listening to one partition, with each POS agent having a unique
> group id. The POS agent only processes the messages that are relevant to it
> and simply ignores the rest. If you have bandwidth or processing power
> concerns, you could also consider subdividing topics (not partitions) per
> region.
>
> Dimitry
>
> On Mon, Apr 1, 2019 at 8:16 AM Hans Jespersen <h...@confluent.io> wrote:
>
> > Yes, but you have more than one POS terminal per location, so you still
> > don't need 20,000 partitions. Just one per location. How many locations do
> > you have?
> >
> > It doesn’t matter anyway, since you can build a Kafka cluster with up to
> > 200,000 partitions if you use the latest versions of Kafka.
> >
> >
> https://blogs.apache.org/kafka/entry/apache-kafka-supports-more-partitions
> >
> > “As a rule of thumb, we recommend each broker to have up to 4,000
> > partitions and each cluster to have up to 200,000 partitions”
> >
> > -hans
> >
> > > On Apr 1, 2019, at 2:02 AM, Alexander Kuterin <akute...@gmail.com> wrote:
> > >
> > > Thanks, Hans!
> > > We use location-specific SKU pricing and send a specific price list to
> > > each specific POS terminal.
> > >
> > > Mon, Apr 1, 2019, 3:01 Hans Jespersen <h...@confluent.io>:
> > >
> > >> Doesn’t every one of the 20,000 POS terminals want to get the same price
> > >> list messages? If so then there is no need for 20,000 partitions.
> > >>
> > >> -hans
> > >>
> > >>> On Mar 31, 2019, at 7:24 PM, <akute...@gmail.com> <akute...@gmail.com> wrote:
> > >>>
> > >>> Hello!
> > >>>
> > >>>
> > >>>
> > >>> I ask for your help in connection with my recent task:
> > >>>
> > >>> - Price lists are delivered to 20,000 points of sale with a frequency
> > >>> of <10 price lists per day.
> > >>>
> > >>> - The order in which the price lists follow one another is important.
> > >>> It is also important that the price lists are delivered to the points
> > >>> of sale online, without delay.
> > >>>
> > >>> - At each point of sale, an agent application is deployed, which
> > >>> processes the received price lists.
> > >>>
> > >>>
> > >>>
> > >>> This task is not particularly difficult. Help in solving the task is
> > >>> not required.
> > >>>
> > >>>
> > >>>
> > >>> The difficulty is that Kafka in our company is a new "silver bullet",
> > >>> and the project manager requires me to implement the following
> > >>> technical solution:
> > >>>
> > >>> deploy 20,000 Kafka consumer instances (one instance for each point of
> > >>> sale) for one topic partitioned into 20,000 partitions - one partition
> > >>> per consumer.
> > >>>
> > >>> The technical problems encountered in experiments with this technical
> > >>> solution do not convince him.
> > >>>
> > >>>
> > >>>
> > >>> Please give me references to books/documents/blog posts which clearly
> > >>> show that Kafka is not intended to be used this way (references to
> > >>> other anti-patterns/pitfalls will also be useful).
> > >>>
> > >>> My own attempts to find such references were unsuccessful.
> > >>>
> > >>>
> > >>>
> > >>> Thank you!
> > >>>
> > >>>
> > >>>
> > >>
> >
>
