Robin,
Thanks for writing back. I am aware of this Kafka connector. Do you have any
info on Kafka connecting to a mainframe?
Regards
Chanchal
-Original Message-
From: Robin Moffatt
Sent: Friday, September 14, 2018 3:06 PM
To: users@kafka.apache.org
Subject: Re: Need info
As a side note to your question, I'd recommend looking into Kafka
Connect. It is another API within Apache Kafka, and it simplifies building
pipelines with Kafka such as the one you are describing.
There are pre-built connectors, including for S3 (
https://www.confluent.io/connector/kafka-con
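To make the Connect suggestion concrete, a sink connector is driven entirely by configuration rather than code. A hedged sketch of an S3 sink connector config, assuming the Confluent S3 sink connector; the connector name, topic, bucket, and region are placeholders, and the full option set is in the connector's documentation:

```json
{
  "name": "s3-sink-example",
  "config": {
    "connector.class": "io.confluent.connect.s3.S3SinkConnector",
    "tasks.max": "1",
    "topics": "statements",
    "s3.bucket.name": "my-staging-bucket",
    "s3.region": "us-east-1",
    "storage.class": "io.confluent.connect.s3.storage.S3Storage",
    "format.class": "io.confluent.connect.s3.format.json.JsonFormat",
    "flush.size": "1000"
  }
}
```

This JSON would typically be POSTed to the Connect worker's REST endpoint to create the connector.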
There are different customers getting data from z/OS sources like DB2 or IMS into Kafka.
They have written their own consumers to load the data into data warehouses like
Teradata or other data warehouse platforms. I am not sure if there is a
particular customer using MongoDB, but there are some using MySQL.
Hi James,
As a matter of interest, is this streaming data fed into some Operational
Data Store (ODS) like MongoDB?
In general, using this method will create a near-real-time snapshot for
business users and customers.
HTH
Dr Mich Talebzadeh
LinkedIn:
https://www.linkedin.com/profile/view?id=A
We have banking customers sending data from DB2 on z to Kafka on Linux (not cloud)
at a transaction rate of 30K per second. Kafka can handle more than this rate.
> On Sep 12, 2018, at 2:31 AM, Chanchal Chatterji
> wrote:
>
> Hi,
>
> In the process of mainframe modernization, we are attempting to str
, 2018 4:00 AM
To: users@kafka.apache.org
Subject: [External] RE: Need info
In simple words, it is like this:
We have an MF application that sends statement data to Kafka from internal
data sources after some processing, which would later be pushed to the cloud
(through Kafka) and will be staged in
-Original Message-
> From: Mich Talebzadeh
> Sent: Wednesday, September 12, 2018 2:16 PM
> To: users@kafka.apache.org
> Subject: Re: Need info
>
> Hi,
>
> As I understand you are trying to create an operational data store from
> your transactional database(s) upstream?
>
> Do y
: users@kafka.apache.org
Subject: Re: Need info
Hi,
As I understand you are trying to create an operational data store from your
transactional database(s) upstream?
Do you have stats on the rate of DML in the primary source? These
insert/update/deletes need to pass to Kafka as messages. Besides what Kafka
can handle (largely depending on the a
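On the "insert/update/deletes need to pass to Kafka as messages" point, a minimal sketch of one way to shape DML events as Kafka messages. This is illustrative only, not any particular CDC product's wire format; the field names and table:pk key scheme are assumptions. Keying by primary key keeps all changes to one row in one partition, so they stay ordered.

```python
import json
from datetime import datetime, timezone

def dml_event(op, table, pk, after=None):
    """Frame one DML change as a (key, value) pair for a Kafka producer."""
    assert op in ("insert", "update", "delete")
    # Key by table + primary key: same row -> same partition -> ordered changes.
    key = f"{table}:{pk}".encode("utf-8")
    value = json.dumps({
        "op": op,
        "table": table,
        "after": after,  # None for a delete
        "ts": datetime.now(timezone.utc).isoformat(),
    }).encode("utf-8")
    return key, value

k, v = dml_event("update", "accounts", "42", {"balance": "100.00"})
```

The resulting pair would be handed to a producer client's send call; the important design choice is the key, since it determines partitioning and therefore per-row ordering.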
Chanchal Chatterji
Principal Consultant,
Infosys Ltd.
Electronic city Phase-1,
Bangalore-560100
Contacts : 9731141606/ 8105120766
-Original Message-
From: Liam Clarke
Sent: Wednesday, September 12, 2018 1:10 PM
To: users@kafka.apache.org
Subject: Re: Need info
The answer to your question is "It depends". If you build your cluster right,
size your messages right, and tune your producers right, you can achieve
near-real-time transport of terabytes of data a day.
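To put "terabytes of data a day" in per-second terms, a quick sizing sketch; the 1 TB/day and 1 KiB message size are illustrative assumptions, not benchmarks:

```python
def required_throughput(tb_per_day, avg_msg_bytes):
    """Return (MB/s, messages/s) needed to sustain tb_per_day."""
    bytes_per_day = tb_per_day * 1024**4      # TiB -> bytes
    bytes_per_sec = bytes_per_day / 86_400    # seconds in a day
    mb_per_sec = bytes_per_sec / 1024**2
    msgs_per_sec = bytes_per_sec / avg_msg_bytes
    return mb_per_sec, msgs_per_sec

mb_s, msg_s = required_throughput(tb_per_day=1.0, avg_msg_bytes=1024)
print(f"1 TB/day is about {mb_s:.1f} MB/s, or {msg_s:,.0f} msgs/s at 1 KiB each")
```

Roughly 12 MB/s sustained for 1 TB/day, which is well within what a modestly sized, well-tuned cluster handles; burst rates and replication overhead are what usually need the headroom.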
There have been plenty of articles written about Kafka performance. E.g.,
https://engineering.lin
Hi,
In the process of mainframe modernization, we are attempting to stream
mainframe data to the AWS Cloud using Kafka. We are planning to use the Kafka
'Producer API' on the mainframe side and the 'Connector API' on the cloud side.
Since our data is processed by a module called 'Central dispatch' located in
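On the producer side, the part worth sketching is how a statement record gets framed as a keyed Kafka message before it is handed to the producer client. The field names ('account_id', 'stmt_date', 'amount') are invented for illustration, not the actual Central dispatch layout:

```python
import json

def to_kafka_record(statement):
    """Serialize one statement as (key, value) bytes for a Kafka producer.
    Keying by account means all statements for one account land in one
    partition, preserving their order."""
    key = statement["account_id"].encode("utf-8")
    value = json.dumps(statement, sort_keys=True).encode("utf-8")
    return key, value

key, value = to_kafka_record({
    "account_id": "ACC-001",
    "stmt_date": "2018-09-12",
    "amount": "125.40",
})
```

With an actual client library, this pair would go into the client's send call against the target topic; the serialization format (JSON here) just has to match what the Connector API side expects to deserialize.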
@Jhon, thanks for pointing that out. I was looking for something like that.
But as Ewen pointed out, it's not open sourced.
Thanks,
Sumit
On Wed, Jan 4, 2017 at 12:07 PM, Ewen Cheslack-Postava
wrote:
> Unfortunately, I don't think it has been open sourced (it doesn't seem to
> be available on ht
Unfortunately, I don't think it has been open sourced (it doesn't seem to
be available on https://github.com/paypal).
-Ewen
On Tue, Jan 3, 2017 at 5:54 PM, Jhon Davis wrote:
> Found an interesting Kafka monitoring tool but no information on whether
> it's open sourced or not.
>
> https://me
Found an interesting Kafka monitoring tool but no information on whether
it's open sourced or not.
https://medium.com/@rrsingh.send/monitoring-kafka-at-scale-paypal-254238f6022d
Best,
J.D.
On Mon, Jan 2, 2017 at 11:01 PM, Sreejith S wrote:
> Hi Sumit,
>
> JMX Metrics will give you lot of i
Hi Sumit,
JMX metrics will give you a lot of in-depth information on Kafka.
Just try this:
https://github.com/srijiths/kafka-connectors/tree/master/kafka-connect-jmx
If you use the above, then you should have a custom UI to show the metrics.
You can also try an open-source Kafka monitoring tool:
https:
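As a concrete starting point for the JMX route, a sketch of exposing a broker's JMX port and a few of the standard broker MBeans. The port number is an arbitrary choice, and unauthenticated remote JMX like this is for test rigs only:

```shell
# JMX_PORT is honored by Kafka's startup scripts (kafka-run-class.sh).
export JMX_PORT=9999
bin/kafka-server-start.sh config/server.properties

# Then point jconsole (or a JMX-scraping agent) at host:9999 and look at
# MBeans such as:
#   kafka.server:type=BrokerTopicMetrics,name=MessagesInPerSec
#   kafka.server:type=BrokerTopicMetrics,name=BytesInPerSec
#   kafka.server:type=BrokerTopicMetrics,name=BytesOutPerSec
```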
Hello Sumit,
Not yet. AFAIK a few topic management requests have been introduced on the
broker side (see https://issues.apache.org/jira/browse/KAFKA-2229), but not
yet in the client APIs. A request for querying/listing topics doesn't even
seem to be planned yet.
AdminUtils, which talks with ZooKeeper via ZkClient, is Ka
Hi,
I am looking to get information about individual brokers in a Kafka cluster.
The information I am looking for is:
- List of topics in a broker
- Partitions for each topic in a broker
- Metrics like BytesIn/BytesOut per minute, MessagesIn per minute per topic
- ...
I have tried looking into
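For the first two items, once you have fetched topic/partition replica metadata with a client, deriving the per-broker view is a small grouping step. A sketch, where the `assignment` dict is made-up sample data standing in for metadata you would actually fetch (the rate metrics in the third item come from JMX instead, as mentioned elsewhere in the thread):

```python
from collections import defaultdict

# {(topic, partition): [broker ids hosting a replica]} -- sample data only.
assignment = {
    ("orders", 0): [1, 2],
    ("orders", 1): [2, 3],
    ("payments", 0): [1, 3],
}

def per_broker(assignment):
    """Invert partition->brokers metadata into broker -> {topic: [partitions]}."""
    view = defaultdict(dict)
    for (topic, partition), brokers in assignment.items():
        for b in brokers:
            view[b].setdefault(topic, []).append(partition)
    return {b: {t: sorted(ps) for t, ps in topics.items()}
            for b, topics in view.items()}

inventory = per_broker(assignment)
# inventory[1] answers "which topics/partitions live on broker 1?"
```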