[ https://issues.apache.org/jira/browse/KAFKA-7971?page=com.atlassian.jira.plugin.system.issuetabpanels:comment-tabpanel&focusedCommentId=16780418#comment-16780418 ]
Maciej Lizewski commented on KAFKA-7971:
----------------------------------------

Yes, it does sound right. It is pretty much what I already did to achieve my goal. What I am trying to say here is that this Transformer still has to be attached to some input stream and must have a transform() method returning "null" - just dropping every incoming message. That is not a very good style of coding in my opinion... The black-box component here should have only an output stream and no input connector.

> Producer in Streams environment
> -------------------------------
>
>                 Key: KAFKA-7971
>                 URL: https://issues.apache.org/jira/browse/KAFKA-7971
>             Project: Kafka
>          Issue Type: Improvement
>          Components: streams
>            Reporter: Maciej Lizewski
>            Priority: Minor
>              Labels: newbie
>
> It would be nice to have Producers that can emit messages to a topic just like
> any producer but also have access to local stores from the Streams environment
> in Spring.
> Consider this case: I have an event-sourced ordering process like this:
> [EVENTS QUEUE] -> [MERGING PROCESS] -> [ORDERS CHANGELOG/KTABLE]
> The merging process uses the local store "opened orders" to easily apply new
> changes.
> Now I want to implement a process that closes abandoned orders (orders that
> were started, but have had no change for too long and hang in the beginning
> status). The easiest way is to periodically scan the "opened orders" store and
> produce an "abandon event" for every order that meets the criteria. The only
> way to do that now is to create a Transformer with a punctuator and connect its
> output to [EVENTS QUEUE]. That is obvious, but the Transformer must also be
> connected to some input stream, and those input events must be dropped since we
> want only the punctuator results. This causes unnecessary overhead in
> processing input messages (even though they are just dropped) and it is not
> very elegant.

--
This message was sent by Atlassian JIRA
(v7.6.3#76005)
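For readers following along, the workaround discussed in the comment above can be sketched roughly as below, using the Kafka Streams Processor API available at the time of this issue. Store and topic names, the timeout, and the scan interval are illustrative assumptions, not taken from the issue; the point is that transform() must exist and return null while all real output comes from the punctuator.

```java
// A sketch (not from the issue) of the described workaround: a Transformer
// attached to an input stream whose transform() drops every record, while a
// wall-clock punctuator scans the "opened-orders" store (hypothetical name)
// and forwards an "abandon" event for each stale order.
import java.time.Duration;
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.kstream.Transformer;
import org.apache.kafka.streams.processor.ProcessorContext;
import org.apache.kafka.streams.processor.PunctuationType;
import org.apache.kafka.streams.state.KeyValueIterator;
import org.apache.kafka.streams.state.KeyValueStore;

public class AbandonedOrderScanner
        implements Transformer<String, String, KeyValue<String, String>> {

    // Illustrative threshold: orders unchanged for 24h count as abandoned.
    private static final long ABANDON_AFTER_MS = Duration.ofHours(24).toMillis();

    private ProcessorContext context;
    private KeyValueStore<String, Long> openedOrders; // orderId -> last-change ts

    @Override
    @SuppressWarnings("unchecked")
    public void init(final ProcessorContext context) {
        this.context = context;
        this.openedOrders =
            (KeyValueStore<String, Long>) context.getStateStore("opened-orders");
        // The punctuator is the only real output of this component.
        context.schedule(Duration.ofMinutes(10), PunctuationType.WALL_CLOCK_TIME, now -> {
            try (KeyValueIterator<String, Long> it = openedOrders.all()) {
                while (it.hasNext()) {
                    final KeyValue<String, Long> entry = it.next();
                    if (now - entry.value > ABANDON_AFTER_MS) {
                        context.forward(entry.key, "abandon");
                    }
                }
            }
        });
    }

    @Override
    public KeyValue<String, String> transform(final String key, final String value) {
        // Every input record is dropped; the upstream connection exists only
        // because the Transformer API requires one. This is the overhead the
        // comment above objects to.
        return null;
    }

    @Override
    public void close() { }
}
```

Wiring it up still requires an input stream, e.g. `events.transform(AbandonedOrderScanner::new, "opened-orders").to("events-queue");`, which is exactly the "input connector" the reporter would like to avoid.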