> 
> This job could be of type Admin, which backs up nothing (but Admin
> jobs will do a prune if required). This way, all your backup jobs are
> done, and then the Admin job has a RunAfter ftp script...

Hmmm... that would be better, but...
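
For the record, here is how I read that suggestion; the job name,
resource names, and script path below are just placeholders of mine,
and I'm assuming the Client/FileSet/Pool directives still have to be
present even though an Admin job ignores them:

  Job {
    Name = "NAS-Upload"
    Type = Admin                  # backs up nothing, prunes if required
    Client = server-fd            # ignored for Admin jobs, I believe,
    FileSet = "Full Set"          # but apparently still required
    Storage = File
    Pool = Default
    Messages = Standard
    Priority = 20                 # queue behind the Priority 10 jobs
    RunAfterJob = "/usr/local/bin/ftp-to-nas.sh"  # the ftp upload
  }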

> > In order to make sure that the job that the run-after script is
> > attached to is really last, I have set Priority = 20 on that job.
> 
> Priority does not affect speed.  Priority is used to determine which
> job runs first if there is more than one job waiting.

All the other jobs that back up to that storage are Priority 10 and so
will run before that last job, which is what I want.
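
Concretely, the setup is roughly this (host names and the JobDefs
name are placeholders of mine):

  Job {
    Name = "host1-backup"
    JobDefs = "LocalStorageJob"   # Priority = 10 via the JobDefs
  }
  Job {
    Name = "host2-backup"
    JobDefs = "LocalStorageJob"
  }
  # ...plus the Priority = 20 NAS-Upload job sketched above, which
  # should therefore be dispatched after all of these.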

> 
> > From the docs, I assume that this means that that job will only run
> > after all the Priority = 10 (the default) jobs have finished, not
> > just the jobs for that storage... is that right? Ideally I'd like
> > for the Priority = 20 job to start as soon as all the local
> > Priority = 10 jobs have finished... is there a way to do that?
> 
> If you make two jobs the same priority but different start times,
> they will run in start time order.  e.g.: 12:34 and 12:35....
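
If I follow, that would be two schedules a minute apart, something
like this (the names are placeholders of mine, echoing your
12:34/12:35 example):

  Schedule {
    Name = "FirstJobs"
    Run = Full mon-sun at 12:34
  }
  Schedule {
    Name = "LastJob"              # same priority, later start time
    Run = Full mon-sun at 12:35
  }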

With spooling though, a job that starts last may not finish last,
which is why I think I need something like the Priority setting to
make sure that all the jobs to that storage have run before I upload
the resulting backup file to the NAS. Unfortunately, it appears that
that last job won't run until _all_ Priority 10 jobs have finished,
including the ones on the other storage.
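
(For reference, the backup jobs here all spool data; the JobDefs is
roughly the following, with the resource names again being
placeholders of mine:)

  JobDefs {
    Name = "LocalStorageJob"
    Type = Backup
    Level = Incremental
    Client = server-fd
    FileSet = "Full Set"
    Storage = File
    Pool = Default
    Messages = Standard
    Priority = 10
    SpoolData = yes               # despool order != start order
  }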

James
