Hi James,

As a matter of interest, is this streaming data fed into some Operational
Data Store (ODS) like MongoDB?

In general, using this method will create a near real-time snapshot for
business users and customers.
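
If it helps, a minimal sketch of that pattern is below, using the plain
Kafka consumer API together with the MongoDB Java sync driver: each
record is upserted into a collection keyed by the Kafka record key, so
the collection always holds the latest state per key, i.e. the near
real-time snapshot. The broker address, topic name, MongoDB URI,
database/collection names and the assumption that the message values
are JSON strings are all placeholders, not anything taken from your
setup.

import com.mongodb.client.MongoClient;
import com.mongodb.client.MongoClients;
import com.mongodb.client.MongoCollection;
import com.mongodb.client.model.ReplaceOptions;
import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.bson.Document;

import java.time.Duration;
import java.util.Collections;
import java.util.Properties;

public class OdsSnapshotConsumer {
    public static void main(String[] args) {
        // Placeholder connection details -- replace with your own.
        Properties props = new Properties();
        props.put("bootstrap.servers", "kafka-broker:9092");
        props.put("group.id", "ods-loader");
        props.put("key.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer",
                "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("enable.auto.commit", "false");

        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props);
             MongoClient mongo = MongoClients.create("mongodb://ods-host:27017")) {

            MongoCollection<Document> snapshot =
                    mongo.getDatabase("ods").getCollection("transactions");
            consumer.subscribe(Collections.singletonList("mainframe-transactions"));

            while (true) {
                ConsumerRecords<String, String> records =
                        consumer.poll(Duration.ofMillis(500));
                for (ConsumerRecord<String, String> record : records) {
                    // Assumes non-null keys and JSON string values; upserting
                    // by key keeps only the latest state per business key,
                    // which is what gives the near real-time snapshot.
                    Document doc = Document.parse(record.value())
                            .append("_id", record.key());
                    snapshot.replaceOne(new Document("_id", record.key()), doc,
                            new ReplaceOptions().upsert(true));
                }
                consumer.commitSync();  // commit only after the writes succeed
            }
        }
    }
}

In practice the official MongoDB Kafka sink connector would do the same
job without custom consumer code; the sketch is only to show the data
flow.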

HTH


Dr Mich Talebzadeh



LinkedIn:
https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw



http://talebzadehmich.wordpress.com


Disclaimer: Use it at your own risk. Any and all responsibility for any
loss, damage or destruction of data or any other property which may arise
from relying on this email's technical content is explicitly disclaimed.
The author will in no case be liable for any monetary damages arising from
such loss, damage or destruction.




On Wed, 12 Sep 2018 at 17:58, James Kwan <jwkwan2...@gmail.com> wrote:

> We have banking customers sending data from DB2 z to Kafka on Linux (not
> cloud) at a transaction rate of 30K per second.  Kafka can handle more than
> this rate.
>
> > On Sep 12, 2018, at 2:31 AM, Chanchal Chatterji <
> chanchal.chatte...@infosys.com> wrote:
> >
> > Hi,
> >
> > In the process of mainframe modernization, we are attempting to stream
> > mainframe data to the AWS cloud using Kafka.  We are planning to use the
> > Kafka 'Producer API' on the mainframe side and the 'Connector API' on the
> > cloud side.  Our data is processed by a module called 'Central dispatch'
> > located on the mainframe and is then sent to Kafka.  We want to know what
> > rate of volume Kafka can handle.  The other end of Kafka is connected to
> > an AWS S3 bucket as sink.  Please help us with this information, or else
> > please connect us with a relevant person who can help us understand this.
> >
> > Thanks and Regards
> >
> > Chanchal Chatterji
> > Principal Consultant,
> > Infosys Ltd.
> > Electronic city Phase-1,
> > Bangalore-560100
> > Contacts : 9731141606/ 8105120766
> >
>
>
