Hi,
I'm new to Kafka and I'm trying to design a wrapper library around Kafka, in both Java
and Go (using the Confluent Kafka Go client), to be used internally. For my
use case, CommitSync is a crucial step: we should only do the next read after
properly committing the previous one. Repeated processing is not a big issue an
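A minimal sketch of that commit-before-next-read loop with the Java client might look like the following. The broker address, group id, and topic name are placeholders, it assumes enable.auto.commit=false and kafka-clients on the classpath, and it needs a running broker, so treat it as a sketch rather than a tested program:

```java
import java.time.Duration;
import java.util.Collections;
import java.util.Properties;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;

public class CommitSyncLoop {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092"); // placeholder broker
        props.put("group.id", "my-wrapper-group");        // placeholder group id
        props.put("enable.auto.commit", "false");         // we commit manually
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Collections.singletonList("my-topic")); // placeholder
            while (true) {
                ConsumerRecords<String, String> records =
                        consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    process(record); // your processing logic
                }
                // Commit only after the whole batch is processed; the next
                // poll() therefore happens only after a successful commit.
                consumer.commitSync();
            }
        }
    }

    private static void process(ConsumerRecord<String, String> record) {
        System.out.println(record.offset() + ": " + record.value());
    }
}
```

If commitSync() throws, the offsets were not committed and the same records will be redelivered after a restart, which matches the "repeated processing is acceptable" stance above.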
Hi,
I see a delay of about 4 to 5 minutes before the initial rebalance triggers when
using the KafkaConsumer.subscribe(Pattern pattern, ConsumerRebalanceListener
listener) signature to subscribe.
Because of this, none of the subscribers fetch any messages for that
duration, although messa
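One cause worth checking (an assumption about your setup, but it matches the 4–5 minute figure): a pattern subscription can only match topics the consumer already knows about from its cluster metadata, and metadata is refreshed every metadata.max.age.ms, which defaults to 300000 ms (5 minutes). Lowering it should make the first pattern match, and hence the initial rebalance, happen sooner. The pattern and the 30 s value below are illustrative:

```java
import java.util.Properties;
import java.util.regex.Pattern;

public class PatternSubscribeConfig {
    // Consumer properties that shorten the metadata refresh interval so a
    // pattern subscription notices matching topics sooner than the 5 min default.
    static Properties buildProps() {
        Properties props = new Properties();
        props.put("metadata.max.age.ms", "30000"); // 30 s instead of 300000 ms
        return props;
    }

    public static void main(String[] args) {
        // The same kind of Pattern you would hand to subscribe(pattern, listener):
        Pattern topics = Pattern.compile("metrics-.*"); // placeholder pattern
        System.out.println(topics.matcher("metrics-eu").matches());
        System.out.println(buildProps().getProperty("metadata.max.age.ms"));
    }
}
```

The trade-off is more frequent metadata requests to the brokers, so don't set it too low on large clusters.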
Hi all,
I was wondering how I could prevent KafkaProducer from blocking when sending
records while the buffer is full. I noticed in the v0.9 docs that there
was a block.on.buffer.full config that could be set to false to achieve
that behaviour; however, it was deprecated and is unavailable in v2.
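The replacement knob in newer clients is max.block.ms, which bounds how long send() (and partitionsFor()) may block waiting for buffer space or metadata; with 0, a full buffer makes send() throw a TimeoutException immediately instead of blocking. A minimal sketch (the 64 MB alternative is just an illustration):

```java
import java.util.Properties;

public class NonBlockingProducerConfig {
    // Build producer properties that fail fast instead of blocking
    // when the record accumulator buffer is full.
    static Properties buildProps() {
        Properties props = new Properties();
        // With max.block.ms=0, send() throws
        // org.apache.kafka.common.errors.TimeoutException when the buffer is full.
        props.put("max.block.ms", "0");
        // Alternatively, give the producer more room than the 32 MB default:
        props.put("buffer.memory", "67108864"); // 64 MB
        return props;
    }

    public static void main(String[] args) {
        System.out.println(buildProps().getProperty("max.block.ms"));
    }
}
```

Your code then has to catch the TimeoutException from send() and decide whether to drop, retry, or back off.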
Hi,
According to the Cloudera Manager Kafka documentation:
log.retention.ms: The maximum time before a new log segment is rolled out.
If both log.retention.ms and log.retention.bytes are set, a segment is
deleted when either limit is exceeded. The special value of -1 is
interpreted as unlimited. This p
There is a difference between retention.bytes and retention.ms. Yes,
retention.bytes can be set to -1, but nowhere do the docs say anything about
retention.ms = -1.
It might be that -1 is simply accepted as a value, which would mean it is
not being validated. But it's not in the official docs. I would
Good time of the day,
We are using Kafka 1.0.1 and want to create a permanent topic. One online
post suggests setting retention.ms and retention.bytes to -1. The sample
below shows the system accepts -1, but I don't see this documented
explicitly anywhere in the official documentation.
Co
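For reference, the usual per-topic recipe from that kind of online post looks like the following (topic name and ZooKeeper address are placeholders; whether -1 for retention.ms is officially supported in 1.0.1 is exactly the open question in this thread, so this is the unverified recipe, not a documented guarantee):

```shell
# Set both retention limits to "unlimited" on an existing topic.
bin/kafka-configs.sh --zookeeper localhost:2181 \
  --entity-type topics --entity-name my-permanent-topic \
  --alter --add-config retention.ms=-1,retention.bytes=-1

# Verify what the broker actually stored:
bin/kafka-configs.sh --zookeeper localhost:2181 \
  --entity-type topics --entity-name my-permanent-topic --describe
```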
Hi Hans,
Thanks for the quick response.
I am gonna look into it.
Thanks
Pulkit
On Fri, Mar 15, 2019 at 11:39 AM Hans Jespersen wrote:
> Take a look at kafka-connect-spooldir and see if it meets your needs.
>
> https://www.confluent.io/connector/kafka-connect-spooldir/
>
> This connector can monitor
Take a look at kafka-connect-spooldir and see if it meets your needs.
https://www.confluent.io/connector/kafka-connect-spooldir/
This connector can monitor a directory and pick up any new files that are
created. Great for picking up batch files, parsing them, and publishing each
line as if it w
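If it helps, a minimal standalone connector config for spooldir looks roughly like this. The class and property names are from memory of the jcustenborder connector and should be checked against its documentation; all paths and the topic name are placeholders:

```properties
name=spooldir-source
connector.class=com.github.jcustenborder.kafka.connect.spooldir.SpoolDirCsvSourceConnector
topic=batch-files
input.path=/data/incoming
finished.path=/data/finished
error.path=/data/error
input.file.pattern=.*\.csv
csv.first.row.as.header=true
```

Files matching input.file.pattern are picked up from input.path, published line by line, and then moved to finished.path (or error.path on parse failure).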
Hi All,
I am building a data pipeline to send logs from one data source to
another node.
I am using Kafka Connect standalone for this integration.
Everything works fine, but the problem is that on Day 1 the log file is renamed to
log_Day0 and a new log file, log_Day1, is created.
And my Kafka Connect do