Re: [ANNOUNCE] New committer: John Roesler

2019-11-12 Thread Patrik Kleindl
Congratulations John! Well deserved, and thanks for all your help. Best regards, Patrik > On 13.11.2019 at 06:10, Kamal Chandraprakash wrote: > > Congrats John! > >> On Wed, Nov 13, 2019 at 7:57 AM Dong Lin wrote: >> >> Congratulations John! >> >>> On Tue, Nov 12, 2019 at 1:56 PM Guozhang

Best config for Kafka 10.0.0.1 consumer.assign

2019-11-12 Thread Upendra Yadav
Hi, I'm using the consumer assign method with a 15000 ms poll timeout to consume single-partition data from another DC. Below are my consumer configs: enable.auto.commit=false max.poll.records=4000 max.partition.fetch.bytes=4096000 key.deserializer=org.apache.kafka.common.serialization.ByteAr
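
A minimal sketch of the assign-and-poll pattern described above, using the configs quoted in the post (the bootstrap servers, topic name, and partition are placeholders, not taken from the thread; on a 0.10.x client the poll timeout is passed as a long):

    import java.util.Collections;
    import java.util.Properties;
    import org.apache.kafka.clients.consumer.ConsumerRecord;
    import org.apache.kafka.clients.consumer.ConsumerRecords;
    import org.apache.kafka.clients.consumer.KafkaConsumer;
    import org.apache.kafka.common.TopicPartition;

    public class AssignExample {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put("bootstrap.servers", "remote-dc-broker:9092");  // placeholder address
            props.put("enable.auto.commit", "false");
            props.put("max.poll.records", "4000");
            props.put("max.partition.fetch.bytes", "4096000");
            props.put("key.deserializer", "org.apache.kafka.common.serialization.ByteArrayDeserializer");
            props.put("value.deserializer", "org.apache.kafka.common.serialization.ByteArrayDeserializer");

            KafkaConsumer<byte[], byte[]> consumer = new KafkaConsumer<>(props);
            // assign() pins the consumer to a single partition, bypassing consumer-group rebalancing
            consumer.assign(Collections.singletonList(new TopicPartition("my-topic", 0)));
            try {
                while (true) {
                    // 15000 ms poll timeout, as in the post
                    ConsumerRecords<byte[], byte[]> records = consumer.poll(15000);
                    for (ConsumerRecord<byte[], byte[]> record : records) {
                        // process the record ...
                    }
                    consumer.commitSync();  // manual commit, since auto-commit is disabled
                }
            } finally {
                consumer.close();
            }
        }
    }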

Re: [ANNOUNCE] New committer: John Roesler

2019-11-12 Thread Kamal Chandraprakash
Congrats John! On Wed, Nov 13, 2019 at 7:57 AM Dong Lin wrote: > Congratulations John! > > On Tue, Nov 12, 2019 at 1:56 PM Guozhang Wang wrote: > > > Hi Everyone, > > > > The PMC of Apache Kafka is pleased to announce a new Kafka committer, > John > > Roesler. > > > > John has been contributing

Re: [ANNOUNCE] New committer: John Roesler

2019-11-12 Thread Dong Lin
Congratulations John! On Tue, Nov 12, 2019 at 1:56 PM Guozhang Wang wrote: > Hi Everyone, > > The PMC of Apache Kafka is pleased to announce a new Kafka committer, John > Roesler. > > John has been contributing to Apache Kafka since early 2018. His main > contributions are primarily around Kafka

Re: [ANNOUNCE] New committer: John Roesler

2019-11-12 Thread Mayuresh Gharat
Congratulations John! Thanks, Mayuresh On Tue, Nov 12, 2019 at 4:54 PM Vahid Hashemian wrote: > Congratulations John! > > --Vahid > > On Tue, Nov 12, 2019 at 4:38 PM Adam Bellemare > wrote: > > > Congratulations John, and thanks for all your help on KIP-213! > > > > > On Nov 12, 2019, at 6:24

Re: [ANNOUNCE] New committer: John Roesler

2019-11-12 Thread Vahid Hashemian
Congratulations John! --Vahid On Tue, Nov 12, 2019 at 4:38 PM Adam Bellemare wrote: > Congratulations John, and thanks for all your help on KIP-213! > > > On Nov 12, 2019, at 6:24 PM, Bill Bejeck wrote: > > > > Congratulations John! > > > > On Tue, Nov 12, 2019 at 6:20 PM Matthias J. Sax > >

Re: [ANNOUNCE] New committer: John Roesler

2019-11-12 Thread Adam Bellemare
Congratulations John, and thanks for all your help on KIP-213! > On Nov 12, 2019, at 6:24 PM, Bill Bejeck wrote: > > Congratulations John! > > On Tue, Nov 12, 2019 at 6:20 PM Matthias J. Sax > wrote: > >> Congrats John! >> >> >>> On 11/12/19 2:52 PM, Boyang Chen wrote: >>> Great work John!

Re: [ANNOUNCE] New committer: John Roesler

2019-11-12 Thread Bill Bejeck
Congratulations John! On Tue, Nov 12, 2019 at 6:20 PM Matthias J. Sax wrote: > Congrats John! > > > On 11/12/19 2:52 PM, Boyang Chen wrote: > > Great work John! Well deserved > > > > On Tue, Nov 12, 2019 at 1:56 PM Guozhang Wang > wrote: > > > >> Hi Everyone, > >> > >> The PMC of Apache Kafka i

Re: [ANNOUNCE] New committer: John Roesler

2019-11-12 Thread Matthias J. Sax
Congrats John! On 11/12/19 2:52 PM, Boyang Chen wrote: > Great work John! Well deserved > > On Tue, Nov 12, 2019 at 1:56 PM Guozhang Wang wrote: > >> Hi Everyone, >> >> The PMC of Apache Kafka is pleased to announce a new Kafka committer, John >> Roesler. >> >> John has been contributing to Ap

Re: [ANNOUNCE] New committer: John Roesler

2019-11-12 Thread Boyang Chen
Great work John! Well deserved On Tue, Nov 12, 2019 at 1:56 PM Guozhang Wang wrote: > Hi Everyone, > > The PMC of Apache Kafka is pleased to announce a new Kafka committer, John > Roesler. > > John has been contributing to Apache Kafka since early 2018. His main > contributions are primarily aro

[ANNOUNCE] New committer: John Roesler

2019-11-12 Thread Guozhang Wang
Hi Everyone, The PMC of Apache Kafka is pleased to announce a new Kafka committer, John Roesler. John has been contributing to Apache Kafka since early 2018. His main contributions are primarily around Kafka Streams, but have also included improving our test coverage beyond Streams as well. Besid

Leveraging DLQ for errors coming from a sink connector plugin

2019-11-12 Thread Javier Holguera
Hi, Looking at the Kafka Connect code, it seems that the built-in DLQ support only works for errors related to transformations and converters (headers, key, and value). I wonder whether it has been considered (and maybe discarded) to use the same mechanism for the call to the connector-plug
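
For reference, the existing DLQ mechanism for sink connectors is enabled with settings like the following (connector name and DLQ topic are placeholders; these are the standard Connect error-handling properties, which today cover converter and transformation failures rather than exceptions thrown by the connector itself):

    # placeholder sink connector config illustrating the built-in DLQ settings
    name=my-sink-connector
    errors.tolerance=all
    errors.deadletterqueue.topic.name=my-sink-dlq
    errors.deadletterqueue.topic.replication.factor=1
    errors.deadletterqueue.context.headers.enable=true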

Re: Install kafka-connect-storage-cloud

2019-11-12 Thread Robin Moffatt
Hi Miguel! If you're using Kafka Connect in standalone mode then you need to pass it a .properties (key=value) file, not JSON. JSON is for Kafka Connect in distributed mode (which I personally advocate, even on a single node); if you use that mode then you pass the JSON to the REST AP
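
To illustrate the distinction, the same connector definition looks roughly like this in the two modes (the connector class and settings below are placeholder assumptions for the S3 sink, not taken from the thread):

    # standalone mode: a key=value .properties file passed to connect-standalone on the command line
    name=s3-sink
    connector.class=io.confluent.connect.s3.S3SinkConnector
    topics=my-topic

    # distributed mode: the equivalent JSON, POSTed to the Connect REST API, e.g.
    # curl -X POST -H "Content-Type: application/json" --data @s3-sink.json http://localhost:8083/connectors
    {
      "name": "s3-sink",
      "config": {
        "connector.class": "io.confluent.connect.s3.S3SinkConnector",
        "topics": "my-topic"
      }
    }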

Install kafka-connect-storage-cloud

2019-11-12 Thread Miguel Silvestre
Hi, I'm new to Kafka (a real newbie) and I'm trying to set up this connector on my local machine, which runs macOS Mojave 10.14.6. I've downloaded the connector, put the contents in the folder /usr/local/share/kafka/plugins, and updated plugin.path in the file /usr/local/etc/kafka/connect-standalone.pr
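
For anyone following the thread, the plugin.path setup being described amounts to a single line in the worker config (the paths are the ones quoted in the post; the standalone worker has to be restarted after the change):

    # in /usr/local/etc/kafka/connect-standalone.properties
    plugin.path=/usr/local/share/kafka/plugins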

Re: MirrorMaker 2 Plugin class loader Error

2019-11-12 Thread Vishal Santoshi
+1 On Mon, Nov 11, 2019 at 2:07 PM Ryanne Dolan wrote: > Rajeev, the config errors are unavoidable at present and can be ignored or > silenced. The Plugin error is concerning, and was previously described by > Vishal. I suppose it's possible there is a dependency conflict in these > builds. Can

Re: kafka-console-consumer --value-deserializer with access to headers

2019-11-12 Thread M. Manna
Hi, On Tue, 12 Nov 2019 at 14:37, Jorg Heymans wrote: > Thanks for helping debug this. You can reproduce the issue using the deserializer below, and invoking kafka-console-consumer with > --value-deserializer=my.BasicDeserializer. As you will see, when the > consumer starts receiving messages

Re: kafka-console-consumer --value-deserializer with access to headers

2019-11-12 Thread Jorg Heymans
Thanks for helping debug this. You can reproduce the issue using the deserializer below, and invoking kafka-console-consumer with --value-deserializer=my.BasicDeserializer. As you will see, when the consumer starts receiving messages only "SERDE WITHOUT HEADERS" is printed to the console. Th
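
The original class is not shown in the archive preview, but a deserializer along the lines described would look roughly like this (the "SERDE WITH HEADERS" label and the package name "my" are assumptions; the post only confirms the "SERDE WITHOUT HEADERS" output). It overrides both deserialize overloads of the Deserializer interface, so the printed string reveals which overload the console consumer actually invokes:

    package my;  // placeholder package matching the --value-deserializer=my.BasicDeserializer invocation

    import java.nio.charset.StandardCharsets;
    import org.apache.kafka.common.header.Headers;
    import org.apache.kafka.common.serialization.Deserializer;

    public class BasicDeserializer implements Deserializer<String> {

        // Two-argument overload: the one the console consumer appears to call
        @Override
        public String deserialize(String topic, byte[] data) {
            System.out.println("SERDE WITHOUT HEADERS");
            return new String(data, StandardCharsets.UTF_8);
        }

        // Three-argument overload, a default method on the interface, overridden here to get access to headers
        @Override
        public String deserialize(String topic, Headers headers, byte[] data) {
            System.out.println("SERDE WITH HEADERS");  // assumed label, per the thread's description
            return new String(data, StandardCharsets.UTF_8);
        }
    }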

Re: kafka-console-consumer --value-deserializer with access to headers

2019-11-12 Thread M. Manna
Hi again, On Tue, 12 Nov 2019 at 12:31, Jorg Heymans wrote: > Hi, > > The issue is not that I cannot get a custom deserializer working, it's > that the custom deserializer I provide implements the default method from > the Deserializer interface > https://github.com/apache/kafka/blob/6f0008643db

Re: kafka-console-consumer --value-deserializer with access to headers

2019-11-12 Thread Jorg Heymans
Hi, The issue is not that I cannot get a custom deserializer working, it's that the custom deserializer I provide implements the default method from the Deserializer interface https://github.com/apache/kafka/blob/6f0008643db6e7299658442784f1bcc6c96395ed/clients/src/main/java/org/apache/kafka/co

Re: kafka-console-consumer --value-deserializer with access to headers

2019-11-12 Thread M. Manna
Hi, On Tue, 12 Nov 2019 at 09:53, Jorg Heymans wrote: > Indeed, I corrected the typo, but now my deserializer class is not taken > into account at all and it falls back to the default deserializer. You can > verify this by putting in a non-existent class; it still runs fine. > > value.deserializer=

Re: kafka-console-consumer --value-deserializer with access to headers

2019-11-12 Thread Jorg Heymans
Indeed, I corrected the typo, but now my deserializer class is not taken into account at all and it falls back to the default deserializer. You can verify this by putting in a non-existent class; it still runs fine. value.deserializer=does.not.exist Jorg On 2019/11/11 14:31:49, "M. Manna" wrote