In simple words, it is like this:

We have a mainframe (MF) application which sends statement data to Kafka from 
internal data sources after some processing. That data would later be pushed to 
the cloud (through Kafka) and staged in an Amazon S3 bucket.
The first time, the entire relevant data set will be pushed to the cloud by the 
MF application; from then on, any data that is updated (any new transaction in 
the MF) will be sent to the cloud through Kafka.
Can Kafka support this type of streaming? I am mainly concerned with 
understanding the maximum volume of data intake Kafka can handle in use cases 
like this.

Please help!

Regards

Chanchal Chatterji 
Principal Consultant,
Infosys Ltd.
Electronic city Phase-1,
Bangalore-560100
Contacts : 9731141606/ 8105120766




-----Original Message-----
From: Mich Talebzadeh <mich.talebza...@gmail.com> 
Sent: Wednesday, September 12, 2018 2:16 PM
To: users@kafka.apache.org
Subject: Re: Need info

Hi,

As I understand it, you are trying to create an operational data store from your 
transactional database(s) upstream?

Do you have stats on the rate of DML in the primary source? Those 
inserts/updates/deletes need to be passed to Kafka as messages. Beyond what Kafka 
itself can handle (which depends largely on the architecture you have built and 
on your Kafka cluster plus Zookeeper ensemble), you will be constrained by the 
rate at which you are allowed to extract data from the mainframe and by your 
shared network. Also, for Kafka and Zookeeper you may want to consider deploying 
these as microservices.
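
For the S3 staging side, the usual approach is a sink connector running in Kafka 
Connect rather than hand-written consumer code. Purely as a sketch (assuming 
Confluent's S3 sink connector, Java 15+, and hypothetical worker URL, topic, 
bucket and region names), registering such a connector through the Connect REST 
API would look roughly like this:

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;

public class RegisterS3Sink {
    public static void main(String[] args) throws Exception {
        // Hypothetical Connect worker URL, topic, bucket and region.
        String connectUrl = "http://connect-worker:8083/connectors";
        String config = """
            {
              "name": "mf-statements-s3-sink",
              "config": {
                "connector.class": "io.confluent.connect.s3.S3SinkConnector",
                "tasks.max": "2",
                "topics": "mf.statements",
                "s3.region": "ap-south-1",
                "s3.bucket.name": "mf-statement-staging",
                "storage.class": "io.confluent.connect.s3.storage.S3Storage",
                "format.class": "io.confluent.connect.s3.format.json.JsonFormat",
                "flush.size": "10000"
              }
            }
            """;

        // POST the connector definition to the Kafka Connect REST API.
        HttpRequest request = HttpRequest.newBuilder()
                .uri(URI.create(connectUrl))
                .header("Content-Type", "application/json")
                .POST(HttpRequest.BodyPublishers.ofString(config))
                .build();

        HttpResponse<String> response = HttpClient.newHttpClient()
                .send(request, HttpResponse.BodyHandlers.ofString());
        System.out.println(response.statusCode() + " " + response.body());
    }
}

Connect workers scale out horizontally, so the mainframe extract rate and the 
shared network are more likely to be the limiting factors than Kafka or the sink.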

HTH,

Dr Mich Talebzadeh



LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw

http://talebzadehmich.wordpress.com


Disclaimer: Use it at your own risk. Any and all responsibility for any loss, 
damage or destruction of data or any other property which may arise from 
relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from such 
loss, damage or destruction.




On Wed, 12 Sep 2018 at 08:32, Chanchal Chatterji < 
chanchal.chatte...@infosys.com> wrote:

> Hi,
>
> In the process of mainframe modernization, we are attempting to stream
> mainframe data to the AWS cloud using Kafka. We are planning to use the
> Kafka 'Producer API' on the mainframe side and the 'Connector API' on the
> cloud side. Our data is processed by a module called 'Central Dispatch'
> located on the mainframe and then sent to Kafka. We want to know what
> rate of volume Kafka can handle. The other end of Kafka is connected to
> an AWS S3 bucket as a sink. Please help us with this information, or else
> connect us with a relevant person who can help us understand this.
>
> Thanks and Regards
>
> Chanchal Chatterji
> Principal Consultant,
> Infosys Ltd.
> Electronic city Phase-1,
> Bangalore-560100
> Contacts : 9731141606/ 8105120766
>
>
