Hello All,
I am trying to push some data into a topic using the KafkaProducer API, but the
initialization is throwing a NullPointerException. I tried debugging but could
not figure it out.
The Kafka Maven dependency version is 2.2.0.
Below is the code and the exception:
> import java.util.Properties;
> import java.util.Random;
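For reference, a minimal, self-contained producer that initializes cleanly against Kafka 2.2.0 looks roughly like this; the broker address, topic name, and use of Random are placeholders, not the original poster's code:

import java.util.Properties;
import java.util.Random;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class SimpleProducer {
    public static void main(String[] args) {
        Properties props = new Properties();
        // Placeholder broker address; replace with your cluster.
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");
        // Key/value serializers are mandatory; leaving them out makes the constructor fail.
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        Random random = new Random();
        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            // "test-topic" is a placeholder topic name.
            producer.send(new ProducerRecord<>("test-topic", "key", "value-" + random.nextInt(100)));
            producer.flush();
        }
    }
}

Comparing the full stack trace against a minimal setup like this usually narrows down which property or dependency is missing.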
MG>below
From: Chris Constantin
Sent: Friday, July 19, 2019 3:15 PM
To: users@kafka.apache.org
Subject: Re: topic admin over SSL/SASL
I think I found the answer: use --command-config param.
MG>as long as your OPS people won't mind implementing the CLI option --command-config
Not ideal since it requires a file, but will do.
Thanks all,
Chris
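For reference, a sketch of what that looks like; the broker host, SASL mechanism, file paths and passwords below are placeholder values, not taken from this thread:

# client.properties, passed to the tool via --command-config
security.protocol=SASL_SSL
sasl.mechanism=PLAIN
sasl.jaas.config=org.apache.kafka.common.security.plain.PlainLoginModule required \
  username="alice" password="alice-secret";
ssl.truststore.location=/path/to/truststore.jks
ssl.truststore.password=changeit

# invoked roughly like this:
kafka-topics.sh --bootstrap-server broker1:9093 \
  --command-config client.properties \
  --create --topic my-topic --partitions 3 --replication-factor 3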
I am also interested in learning how others are handling this.
I also support several services where the average message processing time is
20 seconds per message, but the p99 time is about 20 minutes, and the
stop-the-world rebalancing is very painful.
On Fri, Jul 19, 2019, 11:38 AM Raman Gupta wrote:
I believe if you name your stores, the directory name will reflect that. If
you see any directories in your state.dir corresponding to the store names,
that's RocksDB. As the name suggests, in-memory stores are held (completely)
in memory.
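For illustration, a minimal Streams sketch of naming a persistent store versus an in-memory one; the topic and store names are made up. The named RocksDB store shows up as a directory under state.dir, the in-memory one does not:

import java.util.Properties;

import org.apache.kafka.common.serialization.Serdes;
import org.apache.kafka.common.utils.Bytes;
import org.apache.kafka.streams.KafkaStreams;
import org.apache.kafka.streams.StreamsBuilder;
import org.apache.kafka.streams.StreamsConfig;
import org.apache.kafka.streams.kstream.Materialized;
import org.apache.kafka.streams.state.KeyValueStore;
import org.apache.kafka.streams.state.Stores;

public class StoreNamingExample {
    public static void main(String[] args) {
        StreamsBuilder builder = new StreamsBuilder();

        // Persistent (RocksDB) store: ends up on disk under
        // <state.dir>/<application.id>/<task-id>/rocksdb/word-counts
        builder.<String, String>stream("input-topic")
               .groupByKey()
               .count(Materialized.<String, Long, KeyValueStore<Bytes, byte[]>>as("word-counts")
                                  .withValueSerde(Serdes.Long()));

        // In-memory store: held entirely on the heap, no directory under state.dir.
        builder.<String, String>stream("other-topic")
               .groupByKey()
               .count(Materialized.<String, Long>as(Stores.inMemoryKeyValueStore("session-counts"))
                                  .withValueSerde(Serdes.Long()));

        Properties props = new Properties();
        props.put(StreamsConfig.APPLICATION_ID_CONFIG, "store-naming-example"); // placeholder
        props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");    // placeholder
        props.put(StreamsConfig.DEFAULT_KEY_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.DEFAULT_VALUE_SERDE_CLASS_CONFIG, Serdes.String().getClass());
        props.put(StreamsConfig.STATE_DIR_CONFIG, "/tmp/kafka-streams");        // the default location

        new KafkaStreams(builder.build(), props).start();
    }
}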
On Fri, Jul 19, 2019 at 3:21 AM Muhammed Ashik wrote:
> Hi
Hi Martin,
Thanks for the quick response. Let me clarify:
I am running kafka-topics.sh in a docker container that is not part of the
cluster that I'm trying to create topics in.
I am able to produce/consume over SSL/SASL from that same container, but
that's because I can provide the producer/consumer configs.
I've found
https://cwiki.apache.org/confluence/display/KAFKA/Incremental+Cooperative+Rebalancing:+Support+and+Policies
and
https://cwiki.apache.org/confluence/display/KAFKA/Incremental+Cooperative+Rebalancing+for+Streams.
This is *exactly* what I need, right down to the Kubernetes pod
restart case.
Hi,
How can the security.protocol config be passed in to
'kafka-topics'/TopicAdmin when creating topics?
I can pass
in -Djavax.net.ssl.trustStore, -Djavax.net.ssl.trustStorePassword and
-Djava.security.auth.login.config via KAFKA_OPTS, but I can't figure out
how to set security.protocol and sasl.mechanism.
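If doing it programmatically is an option, the same settings can be handed to the Java AdminClient directly instead of the CLI; a rough sketch, where the broker address, mechanism, credentials and topic name are all placeholders:

import java.util.Collections;
import java.util.Properties;

import org.apache.kafka.clients.admin.AdminClient;
import org.apache.kafka.clients.admin.AdminClientConfig;
import org.apache.kafka.clients.admin.NewTopic;
import org.apache.kafka.common.config.SaslConfigs;
import org.apache.kafka.common.config.SslConfigs;

public class CreateTopicOverSaslSsl {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        // All values below are placeholders for illustration.
        props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "broker1:9093");
        props.put(AdminClientConfig.SECURITY_PROTOCOL_CONFIG, "SASL_SSL");
        props.put(SaslConfigs.SASL_MECHANISM, "PLAIN");
        props.put(SaslConfigs.SASL_JAAS_CONFIG,
                "org.apache.kafka.common.security.plain.PlainLoginModule required "
                + "username=\"alice\" password=\"alice-secret\";");
        props.put(SslConfigs.SSL_TRUSTSTORE_LOCATION_CONFIG, "/path/to/truststore.jks");
        props.put(SslConfigs.SSL_TRUSTSTORE_PASSWORD_CONFIG, "changeit");

        try (AdminClient admin = AdminClient.create(props)) {
            // Create "my-topic" with 3 partitions and replication factor 3.
            admin.createTopics(Collections.singletonList(new NewTopic("my-topic", 3, (short) 3)))
                 .all().get();
        }
    }
}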
I have a situation in which the current rebalancing algorithm seems to
be extremely sub-optimal.
I have a topic with 100 partitions, and up to 100 separate consumers.
Processing each message on this topic takes between 1 and 20 minutes,
depending on the message.
If any of the 100 consumers dies o
Hi all,
Tl;dr: I want to set up a subscription based on patterns that will match
non-existent topics and receive all messages on any matching topic (e.g. test.*)
without missing the first message to any new matching topic; I have set
auto.create.topics.enable=true and allow.auto.create.topics=true and a very low
metadata.max.age.ms.
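For comparison, a minimal consumer sketch of that kind of pattern subscription; the broker address, group id, refresh interval and pattern are placeholders:

import java.time.Duration;
import java.util.Properties;
import java.util.regex.Pattern;

import org.apache.kafka.clients.consumer.ConsumerConfig;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.apache.kafka.common.serialization.StringDeserializer;

public class PatternSubscriber {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ConsumerConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder broker
        props.put(ConsumerConfig.GROUP_ID_CONFIG, "pattern-group");           // placeholder group id
        props.put(ConsumerConfig.KEY_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        props.put(ConsumerConfig.VALUE_DESERIALIZER_CLASS_CONFIG, StringDeserializer.class.getName());
        // Read newly matched topics from the beginning so their first messages are not skipped.
        props.put(ConsumerConfig.AUTO_OFFSET_RESET_CONFIG, "earliest");
        // Low metadata age so topics that appear later and match the pattern are discovered quickly.
        props.put("metadata.max.age.ms", "5000");
        // Consumer-side switch mentioned above; the broker's auto.create.topics.enable must also be true.
        props.put("allow.auto.create.topics", "true");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(Pattern.compile("test.*"));
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    System.out.printf("%s-%d@%d: %s%n",
                            record.topic(), record.partition(), record.offset(), record.value());
                }
            }
        }
    }
}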
Do you get any output from *docker logs*? And does it work if you don't use
authentication?
How about if you try one of the dockerised Kafka Connect examples here?
https://github.com/confluentinc/demo-scene/tree/master/kafka-connect-zero-to-hero
--
Robin Moffatt | Senior Developer Advocate
Hi, if I'm not wrong I remember seeing in the streams code that the default
rocksdb directory under state.dir is "rocksdb" itself. Any content would go under
/tmp/kafka-streams/rocksdb.
On Fri, 19 Jul 2019 at 1:55 AM, Sophie Blee-Goldman
wrote:
> And all four stores (BucketData, CacheData, StationKeyValue,
> StationCac
Hey everyone,
I am having some issues when running Kafka Connect inside of Docker, so I
would really appreciate some feedback on what I'm doing wrong.
I'm able to run locally (by executing `connect-distributed
config.properties`). However, when running in Docker and passing the same
configuration as
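For what it's worth, one common way to run Connect in Docker is the confluentinc/cp-kafka-connect image, which maps each worker property to a CONNECT_-prefixed environment variable; a rough sketch, where the image tag, host names and topic names are assumptions:

docker run -d --name kafka-connect -p 8083:8083 \
  -e CONNECT_BOOTSTRAP_SERVERS=kafka:9092 \
  -e CONNECT_GROUP_ID=connect-cluster \
  -e CONNECT_CONFIG_STORAGE_TOPIC=connect-configs \
  -e CONNECT_OFFSET_STORAGE_TOPIC=connect-offsets \
  -e CONNECT_STATUS_STORAGE_TOPIC=connect-status \
  -e CONNECT_CONFIG_STORAGE_REPLICATION_FACTOR=1 \
  -e CONNECT_OFFSET_STORAGE_REPLICATION_FACTOR=1 \
  -e CONNECT_STATUS_STORAGE_REPLICATION_FACTOR=1 \
  -e CONNECT_KEY_CONVERTER=org.apache.kafka.connect.json.JsonConverter \
  -e CONNECT_VALUE_CONVERTER=org.apache.kafka.connect.json.JsonConverter \
  -e CONNECT_REST_ADVERTISED_HOST_NAME=kafka-connect \
  -e CONNECT_PLUGIN_PATH=/usr/share/java \
  confluentinc/cp-kafka-connect:5.3.0

If the container exits immediately, the first lines of docker logs usually name the missing worker property.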