On 06/28/11 06:56, James Harper wrote:
> I want to back up some data over a potentially unreliable link. The
> ongoing incremental data to back up will be fairly small and the job
> will run as a perpetual incremental using VirtualFull to synthesize a
> full backup once a week or so. The initial data though will be 10GB and
> will take upwards of 4 days to complete at 1mbit/second, which would
> saturate the uplink - and I'd hate to throw away 3 days of backup just
> because the link dropped.

Adding a data limit to a Bacula job really won't do a lot to work around
the unreliability of the link, it'll just make the job terminate early
if you COULD have completed it in one shot.  I'm not sure this idea
makes sense.

The usual recommendation for situations like this, backing up over a
link of limited speed and questionable reliability, is to mirror the
backup source using rsync (which already has rate-limiting options, and
can cleanly resume in case of interruption), then back up the mirror.
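Something along these lines, roughly -- host names, paths, and the rate
cap are placeholders you'd adjust; note rsync's --bwlimit is in KiB/s,
so 100 stays safely under a 1 Mbit/s uplink:

```shell
#!/bin/sh
# Sketch only: mirror the backup source into a local staging area,
# then point the Bacula job at the mirror.
SRC="remotehost:/data/"           # hypothetical source (trailing / = copy contents)
MIRROR="/srv/backup-mirror/data"  # hypothetical local mirror

# --bwlimit caps the transfer rate (KiB/s) so the link isn't saturated;
# --partial keeps partially transferred files so an interrupted run
# picks up where it left off instead of starting over.
# Loop until rsync exits cleanly, in case the link drops mid-transfer.
until rsync -a --partial --bwlimit=100 "$SRC" "$MIRROR"; do
    echo "rsync interrupted; retrying in 60s" >&2
    sleep 60
done
```

Once the mirror is current, the weekly VirtualFull synthesis runs
entirely against local data and never touches the slow link.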

-- 
  Phil Stracchino, CDK#2     DoD#299792458     ICBM: 43.5607, -71.355
  ala...@caerllewys.net   ala...@metrocast.net   p...@co.ordinate.org
  Renaissance Man, Unix ronin, Perl hacker, SQL wrangler, Free Stater
                 It's not the years, it's the mileage.

_______________________________________________
Bacula-devel mailing list
Bacula-devel@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-devel
