Massimiliano, I would handle this with some clever scheduling. Create a few schedules:
Schedule {
  Name = "Weekly-incremental"
  Run = Level=Incremental mon-fri at 20:00
}

Schedule {
  Name = "Weekly-full"
  Run = Level=Full fri at 21:00
}

Schedule {
  Name = "Catalog-backup"
  Run = Full mon-fri at 3:10
}

Essentially, create Jobs that run incrementals on the Weekly-incremental schedule and your weekly full on the Weekly-full schedule; that puts the full job right after Friday's incremental (a rough sketch of the Job definitions is further down). If the incrementals always sync between the sites, the fulls shouldn't matter that much; just make sure you keep a local copy for easy recovery. Also make sure your catalog is backed up offsite every day, which is what the Catalog-backup schedule is for.

Making separate pools for your data is something else to do: one pool that gets synced offsite and another that does not (for the Weekly-full). A sketch of the pools, including retention settings, is below.

Another thing to consider is space. Are you going to want to keep a year of incremental backups on each site? You may want to keep just three months and do quarterly remote-site backups; that will help keep the retention times of your backups in check. Weekly-full backups only need to be kept for a week, but your Weekly-incremental volumes need to be kept until you redo your annual/quarterly full.

Additionally, backups that need to be sent offsite can be spooled to file volumes and then rsynced, or simply copied with a cron job (see the cron sketch below). One caveat: if you remove a volume file from the source side while the catalog still marks that volume as Append, the next job that tries to write to it will error out and get rescheduled; the volume gets marked in Error, which clutters the database and pulls it out of whatever rotation you choose to use.

A possible but unlikely problem with this setup: if someone changes a file between the time the Weekly-incremental completes and the Weekly-full starts, that change is only captured by the full. The data will be lost on a restore attempted after that Weekly-full has been purged; it only becomes a real problem if the file is never touched again and therefore never picked up by a later incremental.
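For reference, here is a rough sketch of what the Job definitions could look like. The client, fileset, storage and pool names are made up for illustration, so adjust them to your setup:

Job {
  Name = "LocA-data-inc"
  Type = Backup
  Level = Incremental
  Client = loca-fd
  FileSet = "LocA-data"
  Schedule = "Weekly-incremental"
  Storage = locb-sd           # writes to the pool that gets synced offsite
  Pool = Synced-pool
  Messages = Standard
}

Job {
  Name = "LocA-data-full"
  Type = Backup
  Level = Full
  Client = loca-fd
  FileSet = "LocA-data"
  Schedule = "Weekly-full"
  Storage = loca-sd           # the full stays local
  Pool = LocalFull-pool
  Messages = Standard
}

Splitting it into two Jobs like this makes it easy to point the incrementals and the full at different pools and storage; the Level in the Run line of each schedule would override the Job's Level anyway.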
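And a sketch of the two pools with retention along those lines; again, the names and the retention periods are only examples to adjust:

Pool {
  Name = Synced-pool            # file volumes that get copied to location B
  Pool Type = Backup
  Volume Retention = 3 months   # keep incrementals until the next quarterly full
  AutoPrune = yes
  Recycle = yes
  Label Format = "Synced-"
}

Pool {
  Name = LocalFull-pool         # Weekly-full volumes, local only
  Pool Type = Backup
  Volume Retention = 10 days    # a full only needs to survive until the next one
  AutoPrune = yes
  Recycle = yes
  Label Format = "Full-"
}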
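The offsite copy itself can be a plain cron entry on the machine that holds the file volumes. The paths and hostname below are invented, and you would want to run it outside the backup window:

# push the synced file volumes to location B every morning at 04:30
30 4 * * * rsync -a --partial /srv/bacula/synced/ bacula@locb:/srv/bacula/from-loca/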
Hope this helps.

Doug Forster

On 06/25/2010 03:39 PM, Massimiliano Perantoni wrote:
> Hi!
> I'm going to plan a remote backup between two places in my company
> that are connected with 10Mbps connectivity. To be sure I will not
> have problems with disaster recovery, I need to back up all the data
> in "location A" to "location B". Up to now we have had a local backup
> working like a charm with bacula; the issue is the remote backup. We
> have 4TB of data and growing, and we calculated that we would spend a
> lot of our time and bandwidth doing the backup, so we would like to
> create a first full backup of "location A" (maybe yearly) to be sent
> to "location B", where we would then do just the incremental ones. We
> would like to do this as automatically as possible, but from what I
> have understood up till now, bacula works on job names, not on
> backed-up files, so if I back up everything with one job, a different
> job on the same fileset would start with a full backup.
>
> How can I solve this problem?
>
> Any help would be really appreciated...
>
> Ciao Massimiliano