"Maarten Hoogveld" <m.hoogv...@elevate.nl> wrote on 30/12/2008 10:59:40 PM:

> [Bacula-users] How to handle long, time consuming backups
> 
> Hello list,
> 
> I have used Bacula for a while now for backing up several computers in
> our network and it works very well.
> I back up several Linux servers and a few clients to a hard disk.
> These are all either on the local network (relative to the director
> and storage) or connected through high-speed internet connections.
> I use priorities to first back up the servers, then the clients,
> then back up the catalog, and finally I run an Admin Job (Type = Admin
> in the Job resource) which starts a shell script that syncs the hard disk
> containing all the volumes to another hard disk. This last step of
> course is to prevent data loss in case of a disk crash. All the
> jobs together take at most 2 hours to complete.
> 
> Now the problem.
> I would like to back up a Windows client which has 20 GB of mostly
> static data over a slow (internet) connection. Incremental backups
> are very quick because of the fairly static nature of the data, but
> doing a full backup would take a little more than 4 days. In
> itself, this is not a problem. I can do a full backup once a month
> and incrementals each day, for instance. The problem is that
> during these 4 days while the full backup runs, no other job can run.
> It blocks all other jobs.
> There is an option to run concurrent jobs which would solve the problem,
> but this only works for jobs with the same priority. Now I need these
> priorities to make sure the copy/sync of the complete storage from
> hdd 1 to hdd 2 runs as the last job. The backup of the catalog
> would also need to run after all other backup jobs, I presume.
> 
> The best solution for me would make it possible to leave the server
> and storage on my local network, and the file daemon on the remote
> network.
> I have come up with some solutions but I am wondering if there is a
> better way. Being able to run concurrent jobs with different
> priorities would be a nice one :)
> Possible solutions:
> - Increase the bandwidth. This is an option. The backup would then be
> the only process needing this resource. The obvious downside is the
> increased cost of backups. Currently they run over a
> simple consumer ADSL line; high upload bandwidth would really
> increase the monthly bill.
> - A local backup solution. The downside to this is decentralisation of
> administration and the loss of the off-site backup advantage for this
> location.
> - Decrease the backup size. Not really easy. The data to be backed
> up is somewhat of an unstructured mess and I would rather not touch
> it if not necessary.
> - Maybe there is a solution in the concurrent-jobs area, but I
> haven't found it yet.
> 
> Does anyone have an idea how to solve this efficiently or have any 
> experience with this type of problem?
> 
> Thanks for any response!
> 
> Maarten Hoogveld

A large, mostly static data set and a low-bandwidth connection: this is 
rsync's specialty!  If you are not familiar with it, look here:

http://samba.anu.edu.au/rsync/

I suggest you use rsync to sync your Windows client to a copy at your main 
site (costing you 20 GB of disk space somewhere), then do your full Bacula 
backup from there.
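
A sketch of the sync step, assuming the Windows client runs an rsync daemon 
or rsync over SSH (e.g. via cwRsync or Cygwin) — the host name and both 
paths here are placeholders, not from the original post:

```
# Hypothetical: pull the client's data to a local staging copy.
# Only changed blocks cross the slow link, so after the first run
# this is roughly as cheap as an incremental.
rsync -avz --delete winclient:/cygdrive/c/data/ /srv/staging/winclient/
```

The --delete flag keeps the staging copy an exact mirror, so the Bacula 
full backup of the staging directory matches what is on the client.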

There will no doubt be some issues with timing/scheduling, and possibly 
permissions, to work out.  And restores will be two-stage.  But I think it 
would be worth it.
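
One way the scheduling could be wired up (a sketch only — the client, 
FileSet, schedule, and script names are assumptions, not Bacula defaults) 
is to back up the staging copy with a normal local job and refresh it 
first with a ClientRunBeforeJob script:

```
# bacula-dir.conf fragment (hypothetical names throughout)
Job {
  Name = "winclient-staged"
  Type = Backup
  Client = backupserver-fd        # FD on the machine holding the staged copy
  FileSet = "winclient-staging"   # points at /srv/staging/winclient
  Schedule = "MonthlyFullDailyIncr"
  Priority = 10                   # same priority as the other fast local jobs
  # Refresh the mirror before each run; rsync only moves deltas, so even
  # the monthly Full then reads from a local, up-to-date copy.
  ClientRunBeforeJob = "/usr/local/bin/sync-winclient.sh"
}
```

Since the slow transfer happens in the rsync step rather than inside a 
Bacula job, the full backup itself runs at local-disk speed and no longer 
blocks the other jobs for days.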


Glen

--
Glen Davison                      d...@sirca.org.au
SIRCA Pty Ltd                      Ph (02) 9236 9133
------------------------------------------------------------------------------
_______________________________________________
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users
