The new KafkaConsumer from Kafka 0.9 should be able to handle this, as the
Kafka client code itself will support it then.
For 0.8.x, we would need to implement recovery support inside the
consumer ourselves, which is why we initially decided to let the job
recovery take care of that.
If th
What I actually meant was partition reassignment (
https://cwiki.apache.org/confluence/display/KAFKA/Replication+tools#Replicationtools-6.ReassignPartitionsTool
).
No topics were deleted.
We added a bunch of new servers and needed to reassign some partitions to
spread the load.
No, I haven't set t
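For reference, the standard reassignment tool linked above takes a JSON file describing the desired replica placement. The topic name, partition, and broker ids below are made-up placeholders, just to show the shape of the input:

```shell
# reassign.json (placeholder topic and broker ids):
#   {"version": 1,
#    "partitions": [
#      {"topic": "my-topic", "partition": 0, "replicas": [3, 4]}
#    ]}
bin/kafka-reassign-partitions.sh --zookeeper zkhost:2181 \
  --reassignment-json-file reassign.json --execute
```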
Hi Jakob,
what exactly do you mean by a rebalance of topics? Did the leader of the
partitions change?
Were topics deleted?
Flink's KafkaConsumer does not try to recover from these exceptions. We
rely on Flink's fault tolerance mechanisms to restart the data consumption
(from the last valid offset).
We did some rebalancing of topics in our Kafka cluster today. I had a Flink
job running, and it crashed when some of the partitions were moved; other
(non-Flink) consumers continued to work.
Should I configure it differently or could this be a bug?
09/24/2015 15:34:31 Source: Custom Source(3/4)
Hi,
did you manually add a Kafka dependency to your project? Maybe you are
overriding the Kafka version with a lower one?
I'm sorry that our consumer crashes when it's supposed to read an
invalid topic .. but in general, that's good behavior ;)
Maybe you can check whether the topic exis
Hit another problem. It is probably related to a topic that still exists in
ZK but is no longer used (and therefore has no partitions), or to starting a
listener for a topic that hasn't been created yet. I would like it not to
crash.
Also, some funny Scala <-> Java
Exception in thread "main" java.lan
That will work. We have some utility classes for exposing the ZK-info.
On Fri, Sep 18, 2015 at 10:50 AM, Robert Metzger
wrote:
> Hi Jakob,
>
> currently, it's not possible to subscribe to multiple topics with one
> FlinkKafkaConsumer.
>
> So for now, you have to create a FKC for each topic .. so
Hi Jakob,
currently, it's not possible to subscribe to multiple topics with one
FlinkKafkaConsumer.
So for now, you have to create an FKC for each topic .. so you'll end up
with 50 sources.
As soon as Kafka releases the new consumer, it will support subscribing to
multiple topics (I think even wit
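Until then, the "one consumer per topic" workaround has a simple fan-in shape: build one source per topic and union them into a single stream. The Flink API calls are elided here; the topic names and the per-topic source helper below are hypothetical stand-ins that only illustrate that shape:

```java
import java.util.Arrays;
import java.util.List;
import java.util.stream.Collectors;
import java.util.stream.Stream;

public class TopicFanIn {
    // Hypothetical stand-in for a per-topic source: each "source" here is
    // just a small stream of records tagged with its topic name. In a real
    // job this would be one FlinkKafkaConsumer added per topic.
    static Stream<String> sourceFor(String topic) {
        return Stream.of(topic + "-record-0", topic + "-record-1");
    }

    public static void main(String[] args) {
        // Hypothetical topic names.
        List<String> topics = Arrays.asList("orders", "clicks", "logs");
        // One source per topic, merged (union) into a single stream --
        // the same shape as adding one consumer per topic as a source.
        List<String> merged = topics.stream()
                .flatMap(TopicFanIn::sourceFor)
                .collect(Collectors.toList());
        System.out.println(merged.size()); // 3 topics x 2 records each
    }
}
```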
Hi,
Would it be possible for the FlinkKafkaConsumer to support multiple
topics, e.g. as a list?
Or would it be better to instantiate one FlinkKafkaConsumer per topic and
add each as a source?
We have about 40-50 topics to listen to for one job.
Or even better, supply a regexp pattern that defines the qu
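Until pattern subscription exists in the consumer, one client-side workaround is to fetch the full list of topic names (e.g. from ZooKeeper) and filter it with a regex, then create one consumer per match. The topic names and pattern below are hypothetical:

```java
import java.util.Arrays;
import java.util.List;
import java.util.regex.Pattern;
import java.util.stream.Collectors;

public class TopicPatternFilter {
    public static void main(String[] args) {
        // Hypothetical topic names; in practice these would be read from ZK.
        List<String> allTopics = Arrays.asList(
                "events.clicks", "events.views", "internal.metrics", "events.orders");
        // Keep only topics matching the (hypothetical) pattern.
        Pattern pattern = Pattern.compile("events\\..*");
        List<String> matching = allTopics.stream()
                .filter(t -> pattern.matcher(t).matches())
                .collect(Collectors.toList());
        // One consumer would then be created per matching topic.
        System.out.println(matching);
    }
}
```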