I am accessing Kafka through Java code. The use case is: I want to publish a
stream of messages to two Kafka topics within a single transaction.

My point of concern is whether it would be better to use one single global
Kafka producer for all messages, or to create and close a new KafkaProducer
for every message published.

What would be the pros and cons of creating and closing a producer for every
request, and can a single producer handle all transactional calls?
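For context, here is a minimal sketch of what I mean by the single-producer approach: one long-lived transactional producer reused for all messages, writing to both topics atomically. The broker address, topic names, and `transactional.id` below are placeholders for illustration:

```java
import java.util.Properties;
import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerRecord;

public class TwoTopicTransaction {

    // Build transactional producer config; txId must be unique per producer
    // instance (broker address and serializers here are placeholders).
    static Properties transactionalProps(String txId) {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");
        props.put("transactional.id", txId);
        props.put("key.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        props.put("value.serializer",
                "org.apache.kafka.common.serialization.StringSerializer");
        return props;
    }

    public static void main(String[] args) {
        // One long-lived producer reused for all transactions, rather than
        // creating and closing a producer per message.
        KafkaProducer<String, String> producer =
                new KafkaProducer<>(transactionalProps("two-topic-producer-1"));
        producer.initTransactions(); // called once, before the first transaction

        try {
            producer.beginTransaction();
            producer.send(new ProducerRecord<>("topic-a", "k", "message for topic A"));
            producer.send(new ProducerRecord<>("topic-b", "k", "message for topic B"));
            producer.commitTransaction(); // both sends become visible atomically
        } catch (Exception e) {
            // neither message is exposed to read_committed consumers
            producer.abortTransaction();
        } finally {
            producer.close();
        }
    }
}
```

My understanding is that `initTransactions()` fences off any earlier producer with the same `transactional.id`, which is why per-message producer creation would also force a distinct `transactional.id` per instance.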

-- 





*Shubham Dhoble*


*Software Engineer*



*a.*  5th & 6th Floor | SKAV 909 | 9/1, Lavelle Road | Bangalore |  560001

*m.*  +91 7728095515

*e*. shub...@mediaiqdigital.com <ras...@mediaiqdigital.com>

*s.* shubham dhoble

