Hi Liam - I did not understand cloning the Kafka broker volumes. If you have a 1 TB disk and usage is at 65%, the data in the volume changes so fast — do we take a copy of the 650 GB every hour, or every minute? And how do we restore if there was a failure?
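[Editor's note: if "cloning volumes" is read as a full copy each time, it is indeed impractical at that churn rate. One common alternative, not spelled out in the thread, is a block-level copy-on-write snapshot, so each backup pass ships only changed blocks. A minimal sketch, assuming the broker's log.dirs sits on a hypothetical LVM logical volume `vg0/kafka-data` and a hypothetical target `backup-host`:]

```shell
# Assumptions (hypothetical names): broker data on /dev/vg0/kafka-data, LVM2
# installed, and enough free extents in vg0 to absorb writes while the
# snapshot exists.

# Take a point-in-time, copy-on-write snapshot; only blocks that change
# afterwards consume space from the 50G reservation:
lvcreate --snapshot --size 50G --name kafka-snap /dev/vg0/kafka-data

# Mount it read-only and ship it off-host; rsync transfers only deltas
# relative to the previous backup:
mount -o ro /dev/vg0/kafka-snap /mnt/kafka-snap
rsync -a --delete /mnt/kafka-snap/ backup-host:/backups/kafka/broker-1/

# Drop the snapshot once the transfer completes:
umount /mnt/kafka-snap
lvremove -y /dev/vg0/kafka-snap
```

[Caveat: a snapshot of one broker is crash-consistent, not cluster-consistent — on restore the broker runs normal log recovery and then catches up from its replicas, so this is not an Oracle-style point-in-time recovery of the whole cluster.]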
Oracle Database provides point-in-time recovery (incremental + full backups + archive logs). Is it possible to recover Kafka like that? We had a storage failure of the entire site, and we were not confident in the data recovered on Kafka compared to the Oracle database. We had 3 Kafka nodes with no MirrorMaker. My understanding is that replication and MirrorMaker only protect you while there is no replication lag; there is no guarantee against data loss. I have never tested MirrorMaker with compacted topics.

Thanks,
pt

On Sun, Aug 9, 2020 at 7:52 AM Liam Clarke-Hutchinson <liam.cla...@adscale.co.nz> wrote:

> Hi Dor,
>
> There are multiple approaches:
>
> 1) Clone your Kafka broker volumes.
> 2) Use Kafka Connect to stream all data to a different storage system such
> as Hadoop, S3, etc.
> 3) Use MirrorMaker to replicate all data to a backup cluster.
>
> Which approach is right for you really depends on your needs, but
> generally, if you have enough nodes in your cluster and a correct
> replication setting for a topic, you won't need to back up Kafka. As a rule
> of thumb, a topic with a replication factor of N can survive N - 1 node
> failures without data loss.
>
> If you can provide more information about the problems you're trying to
> solve, our advice can be more directed :)
>
> Kind regards,
>
> Liam Clarke-Hutchinson
>
> On Sun, Aug 9, 2020 at 11:43 PM Dor Ben Dov <dor.ben-...@amdocs.com>
> wrote:
>
> > Hi All,
> >
> > What is the best recommended way, and tool, to back up Kafka in
> > production?
> >
> > Regards,
> > Dor
> >
> > This email and the information contained herein is proprietary and
> > confidential and subject to the Amdocs Email Terms of Service, which you
> > may review at https://www.amdocs.com/about/email-terms-of-service
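[Editor's note: to make Liam's rule of thumb concrete, here is a hedged sketch of a topic configured to survive broker loss. The broker address and topic name are illustrative, not from the thread:]

```shell
# Replication factor 3: each partition is stored on 3 brokers, so up to
# 2 brokers can fail without data loss, provided the replicas were in sync
# at the time of failure.
kafka-topics.sh --bootstrap-server localhost:9092 \
  --create --topic orders \
  --partitions 12 \
  --replication-factor 3 \
  --config min.insync.replicas=2

# With min.insync.replicas=2 and producers using acks=all, a write is only
# acknowledged once at least 2 replicas have it — which narrows exactly the
# "replication lag" window pt is worried about, for acknowledged messages.
```

[This protects against node failures, not a whole-site storage failure; for the latter you still need cross-site replication (MirrorMaker) or a Connect sink to external storage, and since MirrorMaker replicates asynchronously, some tail of recent messages can be lost on failover, as pt suspects.]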