Hi Ashok, check this article of mine:
Real Time Processing of Trade Data with Kafka, Flume, Spark, Hbase and MongoDB
https://www.linkedin.com/pulse/real-time-processing-trade-data-kafka-flume-spark-talebzadeh-ph-d-/

under *Kafka setup*. A rough partition-sizing sketch is also at the bottom of this mail. HTH

Dr Mich Talebzadeh

LinkedIn: https://www.linkedin.com/profile/view?id=AAEAAAAWh2gBxianrbJd6zP6AcPCCdOABUrV8Pw

http://talebzadehmich.wordpress.com

*Disclaimer:* Use it at your own risk. Any and all responsibility for any loss, damage or destruction of data or any other property which may arise from relying on this email's technical content is explicitly disclaimed. The author will in no case be liable for any monetary damages arising from such loss, damage or destruction.


On Thu, 25 Apr 2019 at 16:18, ASHOK MACHERLA <iash...@outlook.com> wrote:

> Dear Team
>
> Could you please tell me about Kafka topics?
>
> We are getting data from the source into Kafka, about 200 GB on a daily basis.
>
> How many partitions are required to pull data from the source?
>
> Could you also tell me how to set the number of partitions for a topic?
>
> Is there any mathematical rule for partitions?
>
> Sent from Outlook <http://aka.ms/weboutlook>
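On the partition count: there is no single formula, but a common back-of-the-envelope guideline is partitions = max(T/p, T/c), where T is the throughput you need to sustain and p and c are the per-partition throughputs you can achieve on the producer and consumer side. The sketch below is my own rough illustration of that guideline, not something taken from the article, and the throughput figures in it are placeholders you would need to measure in your own cluster.

import math

def partitions_needed(target_mb_s, producer_mb_s_per_partition, consumer_mb_s_per_partition):
    # Rule of thumb: enough partitions to sustain the target rate on
    # whichever side (producing or consuming) is the slower per partition.
    return max(
        math.ceil(target_mb_s / producer_mb_s_per_partition),
        math.ceil(target_mb_s / consumer_mb_s_per_partition),
    )

# 200 GB/day is only ~2.4 MB/s on average; size for your peak rate instead.
avg_mb_s = 200 * 1024 / 86400        # ~2.4 MB/s average
peak_mb_s = 10 * avg_mb_s            # assumed 10x peak, ~24 MB/s (my assumption)
print(partitions_needed(peak_mb_s,
                        producer_mb_s_per_partition=10,   # placeholder, measure it
                        consumer_mb_s_per_partition=5))   # placeholder, measure it
# -> 5 partitions for these assumed numbers

Also bear in mind that you can increase the partition count of an existing topic later (kafka-topics.sh --alter) but never decrease it, so err on the side of a modest number and grow it as your peak throughput becomes clearer.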