Hi Dan,

Dan Langille wrote:
> What I do for large files such as database dumps: run pg_dump to get the 
> text file.  Then use rsync to update the local copy, then backup that.
>   
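(For anyone following along, I take that to mean something along these
lines; database name, paths and host are examples only:

   pg_dump mydb > /var/backups/mydb.sql
   rsync -az /var/backups/mydb.sql backuphost:/srv/dumps/mydb.sql

with the Bacula FileSet on backuphost then pointed at /srv/dumps, so
only the rsync'd text dump gets backed up rather than the live database
files.)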

I was thinking along the same lines.  There is a problem with this 
strategy though.  At the very moment when your kit has fallen over and 
everybody is screaming at you to get it going again, you have a 
complicated procedure to go through to restore it.  Not only do you have 
to format new drive(s), load the OS, set up the network, download Bacula, 
build the Bacula FD, etc.; you also have to remember where your local 
database backup file is kept (or separately restore it from backups) and 
find a way to transfer it to the new device before finally re-loading 
the database.
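(For PostgreSQL the final reload alone is something like the following,
assuming a plain-text dump and example names:

   createdb mydb
   psql mydb < /restore/path/mydb.sql

and that is only once you have already got the dump file back onto the
new box.)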

That's a lot of stress in an already stressful situation ;-)

Much better if Bacula could handle 'true' incremental backups of large 
files in a bandwidth-friendly way.

- David.
