Hello all,
I am unable to purge topic data from Kafka. Is there any class to flush
all topic data?
Thank you
I think this can be done in two ways.
1. A KStream or KTable filter in a topology.
2. Store the data in a persistent store elsewhere (like Cassandra) and expose
it via an API.
--
Rahul Singh
rahul.si...@anant.us
Anant Corporation
On Apr 19, 2018, 7:07 AM -0500, joe_delbri...@denso-diam.com, wrote:
> I
Without JMX it may be difficult. Why not install an agent and report to an
external service like ELK or New Relic?
That's a long-standing industry pattern.
Some reading, and some tools in the readings: these articles are opinionated
towards the vendors that published them, but they're a starting point.
Check my curated list here. I'll add this to an "awesome-Kafka" list when I have
some time, but you should be able to see the 7-8 links I added recently,
specifically on this subject.
http://leaves.anant.us/#!/?tag=kafka,monitoring
--
Rahul Singh
rahul.si...@anant.us
Anant Corporation
On Apr 25, 2018
t field, let me know.
Best,
--
Rahul Singh
rahul.si...@anant.us
Anant Corporation
anyone has a solution.
Thanks,
Rahul Singh
Seems like you need to expose your port via docker run or docker-compose.
https://docs.docker.com/v17.09/engine/userguide/networking/default_network/binding/
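For example, a minimal docker-compose sketch of that port binding; the image name, port, and listener value below are illustrative, not from the original thread:

```yaml
services:
  kafka:
    image: wurstmeister/kafka        # example image; use whichever you already run
    ports:
      - "9092:9092"                  # host:container, makes the broker reachable from the host
    environment:
      # the advertised listener must be reachable from *outside* the container
      KAFKA_ADVERTISED_LISTENERS: PLAINTEXT://localhost:9092
```

With `docker run` the equivalent would be the `-p 9092:9092` flag.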
--
Rahul Singh
rahul.si...@anant.us
Anant Corporation
On Jul 9, 2018, 2:21 PM -0500, Mich Talebzadeh ,
wrote:
> Hi,
>
> I
favorite — as I was using parboiled2 to build a parser —
libraries like shapeless
Best
--
Rahul Singh
rahul.si...@anant.us
Anant Corporation
On Jul 23, 2018, 8:40 AM -0400, M. Manna , wrote:
> Hello,
>
> Is anyone aware of any links or website where I can find information/case
> s
sers may I
> suggest looking at ANTLR?
>
> Idiomatic Scala code can be expensive *as currently implemented*. Just
> understand that cost by profiling, and de-idiomise in hot code as
> needed.
>
> It's a fab language.
>
> jan
>
> On 23/07/2018, Rahul Singh wrote:
Graylog, or Kibana (connected to ElasticSearch)
Rahul
On Aug 3, 2018, 5:12 PM -0400, Akash Jain , wrote:
> In my Java application, I am using a third party library to collect the
> memory profile, and send to a Kafka topic as JSON data.
>
> How can I make sense out of that JSON data? Is there a t
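To make sense of that JSON you would parse each message value and pull out the fields your profiler actually emits. A minimal sketch, assuming invented field names (timestamp, heapUsed, heapMax):

```javascript
// Hedged sketch: the field names below are illustrative, not from any real profiler.
function summarize(jsonValue) {
  const sample = JSON.parse(jsonValue);
  return {
    at: sample.timestamp,
    // percentage of the max heap currently in use
    heapUsedPct: Math.round((sample.heapUsed / sample.heapMax) * 100),
  };
}

// e.g. the `value` of a consumed Kafka message:
const value = JSON.stringify({ timestamp: 1533330720000, heapUsed: 512, heapMax: 2048 });
console.log(summarize(value)); // { at: 1533330720000, heapUsedPct: 25 }
```

Summaries like this can then be shipped to Grafana, Graylog, or Kibana for charting.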
I would recommend using Docker — on Windows it ends up running on a Linux
kernel VM, and it is easier to get started with, albeit with a bit of a
learning curve for Docker itself. Less time wasted overall, and at that point
you would at least know Docker.
Rahul
On Aug 7, 2018, 4:50 AM -0400, jan , wrote:
> I tr
Why do you need that many partitions or topics — what's the business use case?
Rahul Singh
Chief Executive Officer
m 202.905.2818
Anant Corporation
1010 Wisconsin Ave NW, Suite 250
Washington, D.C. 20007
We build and manage digital business technology platforms.
On Dec 10, 2018, 12:09 PM -0500
Hi All,
I am testing Kafka locally and am able to produce and consume messages. But
after consuming a message from the topic, I want to acknowledge it.
Looking for a solution. Please reply if anyone has one.
Thanks & Regards
Rahul Singh
I am using it in Node with the kafka-node module.
On Mon, Jan 21, 2019 at 6:45 PM M. Manna wrote:
> Please read KafkaConsumer javadoc - your answer is already there.
>
> Thanks,
>
> On Mon, 21 Jan 2019 at 13:13, Rahul Singh <
> rahul.si...@smartsensesolutions.com> wrote:
rdkafka/blob/master/README.md
>
> -hans
>
> > On Jan 21, 2019, at 5:17 AM, Rahul Singh <
> rahul.si...@smartsensesolutions.com> wrote:
> >
> > I am using it in Node with the kafka-node module.
> >
> >> On Mon, Jan 21, 2019 at 6:45 PM M. Manna w
he Kafka 0.8 and
> stores offsets in zookeeper (which Kafka 0.9 and above no longer do).
>
> I recommend you use a more up to date nodejs kafka client than this one.
>
> -hans
>
> > On Jan 21, 2019, at 10:02 AM, Rahul Singh <
> rahul.si...@smartsensesolutions.com> wrot
rror Occured ${err}`)
});
Here, the autoCommit property is set to false and I am committing manually via
consumerGroup.commit(), but when I restart the consumer it consumes all the
offsets from the beginning.
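For reference, a hedged kafka-node sketch of manual committing. The host, group, and topic names are placeholders, and the consumer is only wired up when a broker address is supplied via KAFKA_HOST:

```javascript
// Sketch only: kafka-node ConsumerGroup with manual offset commits.
function consumerOptions(groupId) {
  return {
    kafkaHost: process.env.KAFKA_HOST || 'localhost:9092',
    groupId,               // offsets are stored per group; keep it stable across restarts
    autoCommit: false,     // commit explicitly after processing instead
    fromOffset: 'latest',  // only applies when the group has no committed offset yet
  };
}

// Only touch the network when a broker is actually available.
if (process.env.KAFKA_HOST) {
  const { ConsumerGroup } = require('kafka-node');
  const consumer = new ConsumerGroup(consumerOptions('demo-group'), ['demo-topic']);
  consumer.on('message', (message) => {
    // process the message, then persist the offset so a restart resumes here
    consumer.commit((err) => {
      if (err) console.error(`commit failed: ${err}`);
    });
  });
}
```

If the group ID changes between runs, or the process exits before the commit callback fires, the committed offsets are lost and the consumer starts over.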
Thanks
On Mon, Jan 21, 2019 at 11:48 PM Daniel Hinojosa <
dhinoj...@evolutionnext.com> wrote:
> Sh
There is no hard limit on the number of partitions in Kafka. Ideally the
number of partitions equals the number of consumers. The consumer fetches a
batch of messages per partition, so the more partitions a consumer consumes,
the more memory it needs.
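As a back-of-the-envelope sketch of that last point, assuming the Java consumer's default max.partition.fetch.bytes of 1 MiB:

```javascript
// Worst-case fetch-buffer memory grows linearly with assigned partitions,
// since the consumer can buffer up to max.partition.fetch.bytes per partition.
function worstCaseFetchBytes(assignedPartitions, maxPartitionFetchBytes = 1048576) {
  return assignedPartitions * maxPartitionFetchBytes;
}

console.log(worstCaseFetchBytes(100)); // 104857600, i.e. roughly 100 MiB for 100 partitions
```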
On Wed, Jan 23, 2019 at 12:25 PM marimuthu eee
wr
Hi All,
I am facing an error while creating a topic manually using the kafka-node client.
The code is mentioned below.
Can anyone help please?
let topicsToCreate = [{ topic: topicName, partitions: 1, replicationFactor: 2 }];
admin.createTopics(topicsToCreate, (err, data) => {
  if (err) console.error(err);
  else console.log(data);
});
Hi Garvit,
You can check here: https://kafka.apache.org/documentation
Thanks,
Rahul
On Tue, Jun 25, 2019 at 4:11 PM Garvit Sharma wrote:
> Hi All,
>
> I am looking for Kafka consumer API documentation to understand how it
> works internally.
>
> I am facing a problem where my consumer group is
Kafka Manager is a good tool.
On Sat, Aug 24, 2019, 9:28 AM Darius Cooper wrote:
> I'm looking for recommendations for a simple UI-based tool that will help
> with:
> * viewing lists of Kafka topics,
> * viewing consumer groups for each topic,
> * viewing messages for a topic,
> * post test messages to