Hello Ian,
The schedule time and the start time are related but not the same: one is
the time you set in your Schedule resource, the other is the time the job
actually starts running. Usually they differ slightly.
Again, if duplicate jobs are starting and you do not want that, you may
need to review your schedules.
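For example, the Job resource has directives to control duplicate starts.
Here is a sketch (the job name, client, and other resource names are just
illustrations; the duplicate-control directives come from the Bacula
Director configuration):

```conf
# Sketch of a Job resource; "nas-backup" and the referenced resources
# are hypothetical names. The last two directives tell the Director
# how to treat a duplicate of a job that is already queued or running.
Job {
  Name = "nas-backup"
  Type = Backup
  Client = nas-fd
  FileSet = "NAS Set"
  Schedule = "WeeklyCycle"
  Storage = File
  Pool = Default
  Messages = Standard
  Allow Duplicate Jobs = no        # refuse a second copy of the same job
  Cancel Queued Duplicates = yes   # cancel the queued duplicate, keep the running one
}
```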
It would help us understand your issue if you could post the job and
schedule definitions here. Some job output and log information is also
always helpful in these cases.
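If it helps, you can print the current resource definitions from bconsole
with the show command (the resource names below are illustrative):

```
# Run inside bconsole; "nas-backup" and "WeeklyCycle" are example names
show job=nas-backup
show schedule=WeeklyCycle
```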
There are a few possible configurations that can force Bacula to run a Full
instead of a Differential or Incremental, for example changes in the
FileSet. So having a look at your configuration would help us understand
this.
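One such setting: by default, editing the FileSet definition causes Bacula
to upgrade the next backup to a Full. A sketch of a FileSet resource (the
name and path are illustrative):

```conf
FileSet {
  Name = "NAS Set"
  # With the default (Ignore FileSet Changes = no), any edit to the
  # Include list below forces the next backup to be upgraded to a Full.
  Ignore FileSet Changes = yes
  Include {
    Options {
      signature = MD5
    }
    File = /export/nas      # illustrative path
  }
}
```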
The list of files to be backed up is built when the job starts, not when
the job is scheduled.
Best regards,
Ana
On Tue, Jul 19, 2016 at 3:52 PM, Ian Douglas <i...@zti.co.za> wrote:
> hi Ana
>
> On Tuesday 19 July 2016 15:37:05 Ana Emília M. Arruda wrote:
>
> > If you run a differential after a full, then an incremental, but the
> > differential had not finished before the incremental started, then the
> > incremental would check against the last full. So the differential and
> > the incremental would end up identical.
>
> thanks.
>
> But I'm still not sure that's entirely correct.
> I rebuilt a NAS. I had full backup of original.
>
> After rebuilding, paths were slightly different, or at least the file times
> were, so when I re-allowed daily incrementals it effectively did a full
> backup.
>
> Now, about a month later, it did a differential, again effectively a full
> backup.
>
> While the differential was running, two daily incrementals were scheduled.
> I draw a distinction between 'scheduled' and 'started'; I'm not sure if
> you make the same distinction in your usage.
>
> Anyway it ran two incrementals, both of which backed up the same data (as
> per the files and file size in the log). Both started hours after the
> differential finished, and the second over an hour after the first
> finished, with other jobs in between.
>
> My point is that the first incremental was done correctly, but the second
> was an unnecessary duplication because, it appears to me, Bacula decided
> too soon what needed to be backed up.
>
> Alf said I need to look at the 'duplicate jobs' setting, but that sounds
> like a work-around rather than addressing the core issue, i.e. when does
> Bacula decide what needs to be backed up: when scheduled, or when run? The
> second job may be an exact duplicate when scheduled, but the data may have
> changed by run time.
>
> Thanks, Ian
>
> --
> i...@zti.co.za http://www.zti.co.za
> Zero 2 Infinity - The net.works
> Phone +27-21-975-7273
>
_______________________________________________
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users