This is my third Kafka Streams application, and I thought I had gotten to
know the warts and how to use it correctly. But I'm beating my head against
something that I just cannot explain: values written to a table, when later
read back in a join operation, are stale.

Assume the following simplified domain:
ModificationEvent - describes some mutation in the system.
DetailRecord - a detailed record of some occurrence; contains some
metadata and all of the modification events that occurred as part of the
occurrence/incident.

Very simple topology:
KStream<UUID, ModificationEvent>
KTable<UUID, DetailRecord>
The UUID, in this case, is the id of the incident/occurrence being
reported. When there is a "terminal" event, the DetailRecord will be
shipped to an external system.
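
For reference, a minimal sketch of how that topology might be declared (the topic names, and the exact wiring, are assumptions on my part, not from the original setup):

```kotlin
import org.apache.kafka.streams.StreamsBuilder
import org.apache.kafka.streams.kstream.KStream
import org.apache.kafka.streams.kstream.KTable
import java.util.UUID

val builder = StreamsBuilder()

// Stream of mutations, keyed by the incident/occurrence id
val eventsStream: KStream<UUID, ModificationEvent> =
    builder.stream("modification-events")

// Table of detail records, keyed by the same id
val detailTable: KTable<UUID, DetailRecord> =
    builder.table("detail-records")
```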

So I have a very, very simple transformation (pseudo-Kotlin):
eventsStream.join(detailTable) { event, detailRecord ->
  detailRecord.events.add(event)
  detailRecord  // Kotlin lambdas return the last expression
}.to("detail-records")  // the topic backing detailTable (name illustrative)

So the ModificationEvent's stream is joined to the DetailRecord's table,
and the ModificationEvent is appended to the end of the DetailRecord, which
is then written back to the table.

However, on the next modification event for the same detail record, the
detail record is stale and its list of events is empty. What's going on
here?

I've tried temporarily disabling record caching (I didn't think that was
the issue), and even setting the offset commit interval to 0 ms (again, I
didn't think that was the issue). Neither had any effect, other than
slowing the stream down.
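
For what it's worth, those two settings were applied roughly like this (a sketch using the standard StreamsConfig keys; the rest of the properties are omitted):

```kotlin
import org.apache.kafka.streams.StreamsConfig
import java.util.Properties

val props = Properties()
// Disable record caching so table updates are forwarded downstream immediately
props[StreamsConfig.CACHE_MAX_BYTES_BUFFERING_CONFIG] = 0
// Commit offsets as often as possible
props[StreamsConfig.COMMIT_INTERVAL_MS_CONFIG] = 0
```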

Definitely need some help here.
Trey
