[
https://issues.apache.org/jira/browse/KAFKA-1827?page=com.atlassian.jira.plugin.system.issuetabpanels:all-tabpanel
]
Jay Kreps resolved KAFKA-1827.
------------------------------
Resolution: Won't Fix
Yeah this is interesting but currently out of scope.

> Optimistic Locking when Producing Messages
> […] an open account, so the event sequence
> AccountClosed -> MoneyWithdrawn is invalid. The only way I can think of to
> ensure this is the case is to have optimistic locking. Each account would
> have a unique key, and in order to write an event to Kafka the secondary
> store *must* be up-to-date with the previous events for that key. If […]
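
For concreteness, here is a minimal sketch of the check the ticket asks the
broker to enforce, done on the producer side instead. The EntityStore
interface, its version scheme, and the "account-events" topic are
assumptions for illustration, not anything from the ticket:

    // A minimal sketch (not from the ticket) of producer-side optimistic
    // locking, assuming a hypothetical EntityStore that tracks a version
    // per entity key.
    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class OptimisticEventWriter {

        /** Hypothetical secondary index with per-entity state and version. */
        interface EntityStore {
            long currentVersion(String key);            // events applied so far
            boolean isValid(String key, String event);  // e.g. reject MoneyWithdrawn after AccountClosed
            boolean compareAndBumpVersion(String key, long expectedVersion);
        }

        private final KafkaProducer<String, String> producer;
        private final EntityStore store;

        OptimisticEventWriter(KafkaProducer<String, String> producer,
                              EntityStore store) {
            this.producer = producer;
            this.store = store;
        }

        /**
         * Validate against the current projection, then produce. Returns
         * false on a stale view; the caller catches the index up and retries.
         */
        boolean tryWrite(String key, String event) {
            long expected = store.currentVersion(key);
            if (!store.isValid(key, event)) {
                return false; // invalid in the current state, e.g. account closed
            }
            // Optimistic check: fails if another producer advanced this key.
            if (!store.compareAndBumpVersion(key, expected)) {
                return false; // stale view of the entity: catch up and retry
            }
            producer.send(new ProducerRecord<>("account-events", key, event));
            return true;
        }
    }

Note that the window between the compare-and-set and the send is still
unguarded; since Kafka offers no conditional produce, the check-then-send
above is not atomic, which is exactly the gap the ticket describes.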

Regarding the use case – some events may only be valid given a specific
state of the world, e.g. a NewComment event and a PostClosedForComments
event would be valid only in that order. If two producer processes (e.g.
two HTTP application processes) try to write an event each, you may get
integrity issues.
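
A hypothetical interleaving of that race, with illustrative topic, key and
payloads: both processes believe the post is still open, so both sends
succeed. Keying by post id gives the two events a total order within one
partition, but the broker cannot reject the one that is now invalid:

    import java.util.Properties;
    import org.apache.kafka.clients.producer.KafkaProducer;
    import org.apache.kafka.clients.producer.ProducerRecord;

    public class CommentRace {
        static KafkaProducer<String, String> newProducer() {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");
            props.put("key.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            props.put("value.serializer", "org.apache.kafka.common.serialization.StringSerializer");
            return new KafkaProducer<>(props);
        }

        public static void main(String[] args) {
            // Two HTTP application processes, modelled as two producers.
            try (KafkaProducer<String, String> a = newProducer();
                 KafkaProducer<String, String> b = newProducer()) {
                a.send(new ProducerRecord<>("post-events", "post-42", "PostClosedForComments"));
                b.send(new ProducerRecord<>("post-events", "post-42", "NewComment")); // now invalid, still accepted
            }
        }
    }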

Hi,

Couple of comments on this.

What you're proposing is difficult to do at scale and would require some
type of Paxos-style algorithm for the update-only-if-different case - it
would be easier in that case to just go ahead and do the update.

Also, it seems like a conflation of concerns - in an event […]

Daniel Schierbeck created KAFKA-1827:
-------------------------------------

             Summary: Optimistic Locking when Producing Messages
                 Key: KAFKA-1827
                 URL: https://issues.apache.org/jira/browse/KAFKA-1827
             Project: Kafka
          Issue Type: […]

I'm trying to design a system that uses Kafka as its primary data store by
persisting immutable events into a topic and keeping a secondary index in
another data store. The secondary index would store the "entities". Each
event would pertain to some "entity", e.g. a user, and those entities are
stored […]
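
The projection side of that design – the secondary index that folds events
into entities – might look roughly like the sketch below. The
"account-events" topic is an assumption, and a ConcurrentHashMap stands in
for whatever queryable store the index actually lives in:

    import java.time.Duration;
    import java.util.List;
    import java.util.Map;
    import java.util.Properties;
    import java.util.concurrent.ConcurrentHashMap;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.KafkaConsumer;

    public class EntityIndexBuilder {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "localhost:9092");
            props.put("group.id", "entity-index");
            props.put("auto.offset.reset", "earliest"); // rebuild from the full log
            props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
            props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

            Map<String, String> entities = new ConcurrentHashMap<>(); // entity key -> projected state

            try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
                consumer.subscribe(List.of("account-events"));
                while (true) {
                    for (ConsumerRecord<String, String> rec : consumer.poll(Duration.ofMillis(500))) {
                        // Fold the immutable event into the entity it pertains to.
                        entities.merge(rec.key(), rec.value(), (state, event) -> state + ";" + event);
                    }
                }
            }
        }
    }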