On Tue, Jun 28, 2011 at 08:40:47AM -0400, John Drescher wrote:
> On Tue, Jun 28, 2011 at 8:27 AM, Phil Stracchino <ala...@metrocast.net> wrote:
> > On 06/28/11 06:56, James Harper wrote:
> >> I want to back up some data over a potentially unreliable link. The
> >> ongoing incremental data to back up will be fairly small and the job
> >> will run as a perpetual incremental using VirtualFull to synthesize a
> >> full backup once a week or so. The initial data though will be 10GB and
> >> will take upwards of 4 days to complete at 1mbit/second, which would
> >> saturate the uplink - and I'd hate to throw away 3 days of backup just
> >> because the link dropped.
> >
> > Adding a data limit to a Bacula job really won't do a lot to work around
> > the unreliability of the link, it'll just make the job terminate early
> > if you COULD have completed it in one shot. I'm not sure this idea
> > makes sense.
> >
>
> I think the idea is to terminate early without error. So that the next
> incremental ... can pickup where the full left off.
Surely the only difference between an interrupted incremental and a completed
incremental is that the most recently transmitted file in the former might be
broken. So I don't see why Bacula can't just treat an interrupted incremental
as a finished incremental, except for the single file that was being
transmitted at the point of interruption. Then the next incremental that runs
doesn't have to back up the same files again.
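
For context, the setup described above would look roughly like the sketch
below: a job that normally runs Incremental, with a weekly VirtualFull run to
consolidate. The resource names are invented for illustration, and directive
spellings should be checked against the manual for the Bacula release in use.

# Sketch only -- names are made up; check against your Bacula version.

Schedule {
  Name = "WeeklyConsolidate"
  Run = Level=Incremental mon-sat at 23:05
  Run = Level=VirtualFull sun at 23:05
}

Pool {
  Name = RemoteInc
  Pool Type = Backup
  Next Pool = RemoteFull        # VirtualFull writes the synthesized full here
}

Job {
  Name = "RemoteSite"
  Type = Backup
  Level = Incremental
  Client = remote-fd            # hypothetical client behind the slow link
  FileSet = "RemoteSiteFiles"   # hypothetical fileset
  Schedule = "WeeklyConsolidate"
  Storage = File
  Pool = RemoteInc
  Messages = Standard
  Accurate = yes                # consolidation relies on accurate mode
}

The point of the VirtualFull here is that it reads the existing full plus the
incrementals back from the pool on the storage side and writes a new
synthesized full into the Next Pool, so only the incrementals ever cross the
slow link - which is why losing the partially transferred initial data to a
dropped link is so painful.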