Hello
We are consuming two topics (A and B) and joining them, but I have noticed
that no matter what I do, topic A gets consumed first in a batch and then
topic B. Increasing *num.stream.threads* only makes topic A process a lot of
records faster. Topic B has many more messages than topic A.
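(For illustration, a minimal sketch of the kind of two-topic join described above, not the actual app: topic names, serdes, application id, and the join window are placeholders; only num.stream.threads is taken from the message.)

import java.time.Duration;
import java.util.Properties;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.JoinWindows;
import org.apache.kafka.streams.kstream.KStream;

public class TwoTopicJoinSketch {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "two-topic-join");      // placeholder
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");   // placeholder
        // The property mentioned in the message; raising it only adds processing threads,
        // it does not change which topic's records get picked up first.
        props.put(StreamsConfig.NUM_STREAM_THREADS_CONFIG, 4);
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());

        StreamsBuilder builder = new StreamsBuilder();
        KStream<String, String> topicA = builder.stream("topic-A"); // placeholder topic names
        KStream<String, String> topicB = builder.stream("topic-B");

        // Windowed KStream-KStream join of the two inputs.
        topicA.join(topicB,
                    (a, b) -> a + "|" + b,
                    JoinWindows.ofTimeDifferenceWithNoGrace(Duration.ofMinutes(5)))
              .to("joined-output");

        KafkaStreams streams = new KafkaStreams(builder.build(), props);
        streams.start();
        Runtime.getRuntime().addShutdownHook(new Thread(streams::close));
    }
}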
> > ...more or less a wild guess
> > at the moment.)
> >
> > Did you enable caching for the store? (Just to double check if it could
> > be caching related or not.)
> >
> >
> > -Matthias
> >
> >
> > On 12/24/21 11:08 AM, Miguel González wrote:
Hi team
So I ran into a complicated issue, something which I believe Kafka Streams
is not prepared for.
Basically my app is reading from two topics and joining them.
But when testing it in my staging environment I found that one topic moves
faster than the other one, basically pushing stream time forward
Hello
I'm using Kafka Streams and I have a transformer that uses
a TimestampedKeyValueStore, and a punctuator that is in charge of
cleaning the store.
Basically I'm iterating the store using kvStore.all() and deleting the keys
based on some logic with kvStore.delete(key);
I'm seeing the chang
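(For reference, a minimal sketch of the pattern described above, written with the Processor API rather than the Transformer named in the message; the store name, types, and the one-hour retention rule are assumptions.)

import java.time.Duration;
import org.apache.kafka.streams.KeyValue;
import org.apache.kafka.streams.processor.PunctuationType;
import org.apache.kafka.streams.processor.api.Processor;
import org.apache.kafka.streams.processor.api.ProcessorContext;
import org.apache.kafka.streams.processor.api.Record;
import org.apache.kafka.streams.state.KeyValueIterator;
import org.apache.kafka.streams.state.TimestampedKeyValueStore;
import org.apache.kafka.streams.state.ValueAndTimestamp;

public class StoreCleaningProcessor implements Processor<String, String, String, String> {

    private TimestampedKeyValueStore<String, String> kvStore;

    @Override
    public void init(ProcessorContext<String, String> context) {
        kvStore = context.getStateStore("my-store"); // assumed store name
        // Punctuator in charge of cleaning the store, as in the message.
        context.schedule(Duration.ofMinutes(5), PunctuationType.WALL_CLOCK_TIME, this::cleanStore);
    }

    @Override
    public void process(Record<String, String> record) {
        kvStore.put(record.key(), ValueAndTimestamp.make(record.value(), record.timestamp()));
    }

    private void cleanStore(long punctuationTime) {
        // Iterate the whole store with all() and delete keys based on some logic
        // (here: entries older than one hour, as a placeholder rule).
        try (KeyValueIterator<String, ValueAndTimestamp<String>> it = kvStore.all()) {
            while (it.hasNext()) {
                KeyValue<String, ValueAndTimestamp<String>> entry = it.next();
                if (entry.value.timestamp() < punctuationTime - Duration.ofHours(1).toMillis()) {
                    kvStore.delete(entry.key);
                }
            }
        }
    }
}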
Hello
I'm testing a Kafka Streams app in my staging environment. My app reads
from two input topics, but it seems the topics start with months of
difference.
Is there a way to configure KafkaStreams or the consumer it uses to start
from a specific offset?
thanks
- Miguel
> > try to achieve it by using *KStream-GlobalKTable left join*,
> > where the GlobalKTable should read all records at the right topic, and
> > then doing the left join operation. This should then output either (A, B),
> > or (A, null).
> >
> > Thank you.
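(A minimal sketch of the KStream-GlobalKTable left join suggested above, which emits (A, B) when a matching B exists and (A, null) otherwise; topic names, serdes, and the key mapping are assumptions.)

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.kstream.Consumed;
import org.apache.kafka.streams.kstream.GlobalKTable;
import org.apache.kafka.streams.kstream.KStream;
import org.apache.kafka.streams.kstream.Produced;

public class GlobalKTableLeftJoinSketch {
    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();

        // Left side: the stream of A records (assumed topic name).
        KStream<String, String> streamA =
            builder.stream("topic-A", Consumed.with(Serdes.String(), Serdes.String()));

        // Right side: a GlobalKTable that reads all records of the right topic.
        GlobalKTable<String, String> tableB =
            builder.globalTable("topic-B", Consumed.with(Serdes.String(), Serdes.String()));

        // Left join: the joiner receives null for B when there is no match.
        streamA.leftJoin(
                    tableB,
                    (key, aValue) -> key,  // map each A record to its lookup key in B
                    (aValue, bValue) -> "(" + aValue + "," + bValue + ")")
               .to("joined-output", Produced.with(Serdes.String(), Serdes.String()));

        // builder.build() would then be passed to a KafkaStreams instance as usual.
    }
}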
Hello
So I've been using a Streams app to join two input topics... the messages
have a certain order... but I have seen the messages on the output topic
arriving with a different ordering. Even before that, when doing a
map/flatMap operation, they are processed in a different order.
Example:
Stream
Hello
I have been developing a Kafka Streams app that takes two topics as input
KStreams, processes them in some way, joins them, and sends the
combined message to an output topic.
Here's some code:
final StreamJoined joinParams =
StreamJoined.with(
STRING_SERDE,
StreamS
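(The snippet above is cut off; a possible, self-contained shape for it, with the value serdes and types assumed since only STRING_SERDE is visible:)

import org.apache.kafka.common.serialization.Serde;
import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.streams.kstream.StreamJoined;

public class JoinParamsSketch {
    private static final Serde<String> STRING_SERDE = Serdes.String();

    // Key serde as in the snippet; both value serdes are assumed to be String here.
    static final StreamJoined<String, String, String> JOIN_PARAMS =
        StreamJoined.with(
            STRING_SERDE,     // key serde
            Serdes.String(),  // left (this) value serde - assumption
            Serdes.String()); // right (other) value serde - assumption
}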
Hello
For my use case I need to work with a chunk of records, let's say per
month... We have over two years of data... and we are testing whether we can
deploy it to production, but we need to test in small batches.
I have built a Kafka Streams app that processes two input topics and outputs
to one topic
Hello there
Is it possible to pause/restart a Kafka Streams app? I have only found this
discussion
https://groups.google.com/g/confluent-platform/c/Nyj3eN-3ZlQ/m/lMH-bFx-AAAJ
about using map to call an external service and loop until some condition
completes
regards
- Miguel