Hi Ricardo,

What you described sounds very similar to the demo example code shown
here:
https://github.com/apache/kafka/blob/trunk/streams/quickstart/java/src/main/resources/archetype-resources/src/main/java/Pipe.java

Once you start the program, it should pipe all data from the source
topic to the target topic, starting from the earliest offset, no matter
how much data the source topic already has stored.
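For reference, a minimal sketch of such a logging pipe in the style of the linked Pipe.java (the application id, broker address, and topic names "source-topic"/"target-topic" here are placeholders, not from the original thread); note the explicit auto.offset.reset setting Eno mentions below:

```java
import java.util.Properties;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.KStream;

public class LoggingPipe {

    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "logging-pipe");        // placeholder app id
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");   // placeholder broker
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        // For a *new* application id with no committed offsets, start from the
        // beginning of the topic rather than only from newly arriving records:
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> source = builder.stream("source-topic");
        // Log each record in-flight, then forward it to the target topic.
        source.peek((key, value) -> System.out.println("forwarding: " + value))
              .to("target-topic");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}
```

One caveat: auto.offset.reset only applies when the application id has no committed offsets yet. An app that has already run (and committed) will resume from its committed position regardless of this setting.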


Guozhang



On Sat, Aug 12, 2017 at 2:35 PM, Eno Thereska <eno.there...@gmail.com>
wrote:

> Hi Ricardo,
>
> Kafka Streams should handle that case as well. What streams config are you
> using, could you share it? There is one parameter that is called
> “ConsumerConfig.AUTO_OFFSET_RESET_CONFIG” and by default it’s set to
> “earliest”. Any chance your app has changed it to “latest”?
>
> Thanks
> Eno
>
> > On Aug 12, 2017, at 5:13 PM, Ricardo Costa <rdsco...@gmail.com> wrote:
> >
> > Hi,
> >
> > I've implemented a forwarding consumer which literally just consumes the
> > messages from a source topic, logs them and then publishes them to a
> target
> > topic.
> >
> > I wanted to keep the implementation simple with very little code so I
> went
> > with kafka-streams. I have a really simple topology with a source for the
> > source topic, a sink for the target topic and a logging processor
> > in-between.
> >
> > I'm quite happy with the solution, really simple and elegant, I ran some
> > basic tests and everything seemed to be working. As I went on to build
> more
> > test cases, I found that the stream only does its thing if I push
> messages
> > to the source topic *after* creating the stream and waiting until it is
> > fully initialized. Is this the expected behaviour? I need the stream to
> be
> > started at any point in time and forward the messages that were buffered
> on
> > the source topic until then. Are kafka-streams not fit for this use case?
> > Or am I missing something?
> >
> > Thanks in advance!
> >
> > --
> > Ricardo
>
>


-- 
-- Guozhang
