Discovery of new topics is not a concern. A client that creates a new topic
sends my process the topic name. In other words, my process performs some
operations that any client might want. So they create a Kafka topic
and send me the name of the topic for me to subscribe to.
My process needs st
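A common way to handle this (sketched below, with an assumed kafka-python client and made-up topic names) is to keep the full topic set yourself, since `subscribe()` replaces the previous subscription rather than appending to it:

```python
# Sketch: adding a newly announced topic to a running consumer.
# KafkaConsumer.subscribe() replaces the prior subscription, so the usual
# approach is to track the complete topic set and re-subscribe with the
# union. The kafka-python client and topic names are assumptions.

def updated_subscription(current, new_topics):
    """Return the full, sorted topic set after adding new topics."""
    return sorted(set(current) | set(new_topics))

# With a real client (not run here):
#   from kafka import KafkaConsumer
#   consumer = KafkaConsumer(bootstrap_servers="localhost:9092")
#   topics = updated_subscription(topics, ["client-topic-42"])
#   consumer.subscribe(topics)  # replaces, not appends

print(updated_subscription(["a", "b", "c"], ["d"]))  # → ['a', 'b', 'c', 'd']
```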
Hi,
just to clarify: this is the cause of the crash (broker logs:
https://pastebin.com/GuF60kvF), which is why I referenced
https://issues.apache.org/jira/browse/KAFKA-4523
I had this crash some time ago and yesterday was in the process of upgrading my
brokers to 0.11.0.2 in part to address
The assignment strategy cannot be configured in Kafka Streams atm.
How many partitions do you have in your topics? If I read between the
lines, it seems that all topics have just one partition? In that case,
it will be hard to scale out, as Kafka Streams scales via the number of
partitions...
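The point above can be sketched in a few lines: Kafka Streams creates one task per input partition and spreads tasks over instances, so with a single partition only one instance ever gets work. (Round-robin here is an illustration, not the actual assignor logic.)

```python
# Illustrative sketch of partition-count parallelism, not the real assignor.
def assign_tasks(num_partitions, instances):
    """Spread one task per partition over instances, round-robin."""
    assignment = {i: [] for i in instances}
    for p in range(num_partitions):
        assignment[instances[p % len(instances)]].append(p)
    return assignment

print(assign_tasks(1, ["A", "B", "C"]))  # only instance A gets a task
print(assign_tasks(6, ["A", "B", "C"]))  # two partitions per instance
```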
I'm a bit confused. How do you discover that new topics have been created?
If you discover a new topic somehow, how do you know it's one you are
interested in?
I suppose you could have a consumer which subscribes to all topics using
pattern=".", then have it just publish new topic names to a fixed
Hi,
I am using Kafka Streams for one of our applications. The application has
several types of topics, like the initial ingestion topics and the live
stream topic, all sharing the same state in a continuous fashion.
My problem is the assignment of these topics/partitions, where I am
observ
In theory, pattern "." would work. But that would mean subscribing to all
topics, which I don't want.
I share the same pain in using a pattern, which is why I would like to know if
it is good practice to subscribe to topics after the Kafka consumer has started.
Regards,
Chintan
On 06-Jan-2018 5:47 PM, "Skip
Ismael:
We're on the same page.
0.11.0.2 was released on 17 Nov 2017.
By 'recently' in my previous email I meant the change was newer.
Vincent:
Did the machine your broker ran on experience a power issue?
Cheers
On Sat, Jan 6, 2018 at 7:36 AM, Ismael Juma wrote:
> Hi Ted,
>
> The change you m
Hi
My Kafka (0.11.0) broker with G1 GC is going OutOfMemory every time when I am
consuming one topic with 6 partitions.
I have increased *max.partition.fetch.bytes* to *10MB* because some
of the messages are pretty big (though very few). But when I ran with the same
configuration for a long time my Kafka broker got
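As a rough lower bound on the consumer side (assuming the config meant is `max.partition.fetch.bytes`), each assigned partition can return up to that many bytes in a single fetch response, so the per-fetch buffer alone is:

```python
# Back-of-the-envelope fetch-memory bound for one consumer, using the
# numbers from the email above. This ignores decompression, in-flight
# fetches to multiple brokers, and broker-side memory, all of which add more.
max_partition_fetch_bytes = 10 * 1024 * 1024  # 10 MiB, as configured
partitions = 6
per_fetch = max_partition_fetch_bytes * partitions
print(per_fetch // (1024 * 1024))  # 60 (MiB) per fetch, minimum
```

60 MiB per fetch is modest for a consumer, which suggests looking at broker-side heap settings and replication fetch sizes as well.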
Hi Ted,
The change you mention is not part of 0.11.0.2.
Ismael
On Sat, Jan 6, 2018 at 3:31 PM, Ted Yu wrote:
> bq. WARN Found a corrupted index file due to requirement failed: Corrupt
> index found, index file
> (/data/kafka/data-processed-15/54942918.index)
>
> Can you search back
bq. WARN Found a corrupted index file due to requirement failed: Corrupt
index found, index file
(/data/kafka/data-processed-15/54942918.index)
Can you search backward for 54942918.index in the log to see if
we can find the cause of the corruption?
This part of the code was recently changed.
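Searching backward for context around the index name can be done with grep, or with a small helper like this hypothetical sketch (file path and context size are illustrative):

```python
# Hypothetical helper: find every mention of a corrupted index file in a
# broker log, returning each hit together with the lines just before it,
# to see what preceded the corruption.
def lines_mentioning(path, needle, context=2):
    with open(path) as f:
        lines = f.read().splitlines()
    hits = []
    for i, line in enumerate(lines):
        if needle in line:
            hits.append(lines[max(0, i - context): i + 1])
    return hits

# Usage (path is illustrative):
#   lines_mentioning("server.log", "54942918.index")
```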
This is “normal” as far as I know. We’ve seen this behavior after unclean
shutdowns of 0.10.1.1.
In the event of an unclean shutdown Kafka seems to have to rebuild some indexes
and for large data directories this takes some time. We got bit by this a few
times recently when we had boxes that po
Here's an excerpt just after the broker started: https://pastebin.com/tZqze4Ya
After more than 8 hours of recovery the broker finally started. I haven't read
through all 8 hours of log but the parts I looked at are like the pastebin.
I'm not seeing much in the log cleaner logs either, they look
In theory, wouldn't
consumer.subscribe(pattern=".")
work? I say "in theory" because my experience with subscribing by pattern
hasn't been great. I suspect my mental model of how it's implemented isn't
a close approximation of reality.
Skip
On Jan 6, 2018 4:07 AM, "chintan mavawala" wrote:
I w
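One caveat with the pattern "." suggested above: if the client applies full-match semantics when subscribing by pattern (as I understand the Java consumer does), "." matches only single-character topic names, and ".*" is the usual catch-all. A quick illustration with Python's `re` (topic names are made up):

```python
import re

# Full-match semantics: "." matches exactly one character, so it only
# catches one-character topic names; ".*" matches everything.
topics = ["a", "orders", "payments"]
print([t for t in topics if re.fullmatch(".", t)])   # ['a']
print([t for t in topics if re.fullmatch(".*", t)])  # all three
```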
I want to subscribe to all topics as they are created, but the topic names do
not have any pattern. The consumer starts with a subscription to, let's say, 3
topics and adds more topics to the existing Kafka consumer as they are created.
Regards,
Chintan
On 04-Jan-2018 11:02 PM, "Jordan Pilat" wrote:
> Did you