d@21e8f193), responseAction=SendAction),
responseAsString=Some({throttle_time_ms=0,error_code=0})
(kafka.network.Processor)
On Mon, Jan 8, 2018 at 12:08 PM, Nishanth S wrote:
> Hi Everyone,
> We are running into an issue where the Kafka producer hangs after the
> initTransactions API. I have create
Hi Everyone,
We are running into an issue where the Kafka producer hangs after the
initTransactions API. I have created a new topic and the same issue
happens intermittently. It is not consistent, but it occurs more often
than not. We are running Confluent 4.0 and using the below dependencies
on the clients
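A common cause of `initTransactions()` hanging, especially on small or single-broker development clusters, is the internal transaction state topic failing to form its ISR under the default replication settings, in which case the producer blocks waiting for a transaction coordinator. A sketch of the relevant settings (values are illustrative assumptions for a dev setup, not recommendations for production):

```properties
# Producer side: give initTransactions() a bounded wait instead of an
# indefinite hang (it should fail with a TimeoutException after this).
transactional.id=my-txn-producer
max.block.ms=60000

# Broker side (server.properties): with fewer than 3 brokers, the
# transaction state topic cannot satisfy the defaults of 3 / 2 below,
# and initTransactions() blocks until it can.
transaction.state.log.replication.factor=1
transaction.state.log.min.isr=1
```

If the hang also occurs on a full-size cluster, checking the broker logs for the `__transaction_state` topic's replication status is a reasonable first step.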
Hi All,
What is the best way to add schemas to Schema Registry? I am using the
Confluent platform and our data is in Avro. We use the Confluent Kafka Avro
serializers, and I read that using these would register the schema in
Schema Registry automatically. We also use the same schemas for downstrea
Thank you!
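For reference, the serializer-driven registration described above is controlled by the producer's serializer configuration; a minimal sketch (the registry URL is an assumption for your environment):

```properties
key.serializer=io.confluent.kafka.serializers.KafkaAvroSerializer
value.serializer=io.confluent.kafka.serializers.KafkaAvroSerializer
schema.registry.url=http://localhost:8081
# auto.register.schemas defaults to true; set it to false if schemas
# must be registered out-of-band before producers may use them.
auto.register.schemas=true
```

Schemas can also be registered explicitly through the registry's REST API by POSTing the schema to `/subjects/<subject>/versions`, which is the usual approach when registration is part of a deployment pipeline rather than left to producers.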
On Tue, Dec 19, 2017 at 10:49 AM, Matthias J. Sax
wrote:
> This should help (Section "The good news: Kafka is still fast!")
>
> https://www.confluent.io/blog/exactly-once-semantics-are-possible-heres-how-apache-kafka-does-it/
>
> -Matthias
>
Hello,
I came across this really good article on Kafka benchmarking, but it was
done with an older version (
https://engineering.linkedin.com/kafka/benchmarking-apache-kafka-2-million-writes-second-three-cheap-machines
). We are planning to enable idempotence for an exactly-once guarantee. Has
any o
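For anyone benchmarking with idempotence enabled, the producer settings involved look roughly like the following (a sketch against Kafka 1.0 / Confluent 4.0 era clients; values other than `enable.idempotence` are the constraints that setting imposes, not tuning advice):

```properties
enable.idempotence=true
# Idempotence requires acks=all and a non-zero retry count; conflicting
# explicit values cause a ConfigException at producer construction.
acks=all
retries=2147483647
# With idempotence, in-flight requests per connection must not exceed 5.
max.in.flight.requests.per.connection=5
```

The throughput cost of these settings relative to `acks=1` is exactly what the benchmark comparison above would measure.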
ck).
>
> -David
>
> http://docs.confluent.io/current/connect/connect-hdfs/docs/index.html
>
> On 9/11/17, 4:48 PM, "Nishanth S" wrote:
>
> All,
> I am very new to Kafka. We have a case where we need to ingest
> multiple
> Avro record types.
All,
I am very new to Kafka. We have a case where we need to ingest multiple
Avro record types. These record types vary vastly in volume and size,
and I am thinking of sending each of these message types to a different
topic and creating partitions based on the volume and throughput needed.
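Sizing partitions per topic by volume, as proposed above, usually reduces to a simple throughput calculation; a minimal sketch (the throughput figures are hypothetical and would come from measuring one partition on the target hardware):

```python
import math

def partitions_needed(target_mb_s: float, per_partition_mb_s: float) -> int:
    """Partitions required so aggregate partition throughput meets the target."""
    return math.ceil(target_mb_s / per_partition_mb_s)

# A high-volume record type targeting 50 MB/s, where one partition
# sustains about 10 MB/s on this hardware (hypothetical figures):
print(partitions_needed(50, 10))   # 5
# A low-volume record type targeting 3 MB/s fits in one partition:
print(partitions_needed(3, 10))    # 1
```

Leaving some headroom above the computed minimum is common, since adding partitions later changes key-to-partition assignment for keyed topics.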
Hello,
We are investigating ingesting Avro records into Kafka using the Avro Kafka
serializer. Our schemas are nested and are of type record. Does the current
Avro Kafka serializer support the Avro record type?
If not, is there a way to ingest records and consume them with a consumer
without using a
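For context, a nested record schema of the kind described would look like the following (a hypothetical example; the record and field names are illustrative only):

```json
{
  "type": "record",
  "name": "Order",
  "fields": [
    {"name": "id", "type": "string"},
    {"name": "customer", "type": {
      "type": "record",
      "name": "Customer",
      "fields": [
        {"name": "name", "type": "string"},
        {"name": "email", "type": ["null", "string"], "default": null}
      ]
    }}
  ]
}
```

The Confluent Avro serializer operates on whole Avro `GenericRecord`/`SpecificRecord` values, so nesting records inside records is within what standard Avro serialization handles.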