On Fri, December 14, 2007 10:07 am, David Legg wrote:
> Hi Dan,
>
> Dan Langille wrote:
>> What I do for large files such as database dumps: run pg_dump to get the
>> text file.  Then use rsync to update the local copy, and back that up.
>>
>
> I was thinking along the same lines.  There is a problem with this
> strategy though.  At the very moment when your kit has fallen over and
> everybody is screaming at you to get it going again, you have a
> complicated procedure to go through to restore it.  Not only do you have
> to format new drive(s), load the OS, set up the network, download
> Bacula, build the Bacula FD, etc.; you also have to remember where your
> local database backup file is kept (or restore it separately from
> backups) and find a way to transfer it to the new device before finally
> reloading the database.
>
> That's a lot of stress in an already stressful situation ;-)
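
For reference, the dump-and-rsync step quoted above might look something
like this.  This is only a rough sketch; the database name, paths, and
the host "backuphost" are illustrative, not from any real setup:

  # Dump the database to a plain-text SQL file:
  pg_dump -U postgres mydb > /var/backups/mydb.sql

  # Update the copy on the backup host; rsync's delta transfer
  # sends only the parts of the file that changed since last time:
  rsync -az /var/backups/mydb.sql backuphost:/var/backups/mydb.sql

  # The copy on backuphost is then picked up by the regular
  # Bacula file backup job.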

That stress is exactly why you have written a document that outlines the
steps to be performed, including the database restore.  You have also
given it a few dry runs, just to be sure you have not missed anything.
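
The restore side, assuming the dump was staged as in the sketch above
(again, the host, database name, and paths are illustrative), might be as
simple as:

  # Pull the most recent dump back from the backup host:
  rsync -az backuphost:/var/backups/mydb.sql /var/backups/mydb.sql

  # Recreate the database and load the dump:
  createdb -U postgres mydb
  psql -U postgres mydb < /var/backups/mydb.sql

Giving those steps a dry run on a scratch machine is what turns the
"complicated procedure" into a checklist.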

> Much better if Bacula could handle 'true' incremental backup of large
> files in a bandwidth-friendly way.

Standard response to such things: clearly it is not important enough to
anyone; otherwise they would have coded it by now.  ;)


-- 
Dan Langille - http://www.langille.org/

