Hello bacula users,
I was wondering if the latest development 1.37 version of bacula has
support for executing more than one job at a time. If there are
multiple drives in the tape library, you should be able to write two
tapes in one go. Is this supported in the 1.37 branch?
Many thanks,
An
On Sunday 03 July 2005 16:25, Andrei Mikhailovsky wrote:
> Hello bacula users,
>
> I was wondering if the latest development 1.37 version of bacula has
> support for executing more than one job at a time.
This has been possible for at least 4 years (in many versions).
> In case if there are
> mu
Well, executing multiple jobs doesn't work with my setup.
A short hardware spec: Overland tape library, 15 tapes, 2 DLT drives.
At the moment, I have a full backup running on a remote server, writing
to Drive01. It is estimated to last about 3 days. Currently the backup
process lasts for ab
On Thu, Jun 30, 2005 at 11:57:04AM +1000, Jesus Salvo Jr. wrote:
> I have a job where I only ever want an incremental-level backup done.
>
> The reason is that the fileset associated with the job only keeps the
> last N days of files on disk. That is, there is a cron job that deletes the
Andrei Mikhailovsky wrote:
However, the local backup jobs still wait for the remote one to
finish. So I will have about 3 days of missed backups, which I can't
really afford.
Does anyone know what I am doing wrong?
Read up on 'Concurrent' jobs
Yeah, I thought of that before I set up the backups. I currently
have 10 backup jobs, configured like this:
# grep Concurrent *
bacula-dir.conf: Maximum Concurrent Jobs = 10
bacula-fd.conf: Maximum Concurrent Jobs = 10
bacula-sd.conf: Maximum Concurrent Jobs = 10
So this shouldn't be an issue.
If
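A hedged aside on the grep output above: the daemon-level limits shown are not the only gate. Bacula also reads Maximum Concurrent Jobs from the Storage, Client, and Job resources inside bacula-dir.conf, and I believe each of those defaults to 1, which will serialize jobs even when the daemon limits are raised. A sketch (the resource names here are made up; the "..." stands for the rest of each resource):

```
# bacula-dir.conf -- per-resource concurrency limits (hypothetical names)
Storage {
  Name = Overland-Autochanger
  ...
  Maximum Concurrent Jobs = 10   # defaults to 1 if omitted
}
Job {
  Name = "RemoteFull"
  ...
  Maximum Concurrent Jobs = 10   # defaults to 1 if omitted
}
```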
Andrei Mikhailovsky wrote:
If I do a status of the Director, I get the following:
Running Jobs:
Console connected at 02-Jul-05 19:18
JobId Level Name Status
==
>
> As far as I know (I may be wrong on this one), simultaneous jobs will
> only run when they have the SAME priority. In addition to that, the
> priority is only taken into account when multiple jobs start at the same
> time or get queued; once started, priority is ignored (i.e. a low-priority
> j
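To illustrate the point being quoted: Priority is a per-Job directive in bacula-dir.conf, and jobs must share the same value to be candidates for concurrent execution. A sketch (the job name is made up; the "..." stands for the rest of the resource):

```
# bacula-dir.conf -- equal priorities allow concurrency (hypothetical name)
Job {
  Name = "LocalBackup"
  ...
  Priority = 10   # must match the other jobs' Priority to run alongside them
}
```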
Hi All,
Bacula version 1.36.3
I changed my FileSet from:
FileSet {
  Name = "SybaseDump FileSet"
  Include {
    File = "/u03/sybbackup/syddbshared"
  }
}
to:
FileSet {
  Name = "SybaseDump FileSet"
  Include {
    File = "\\|find /u03/sybbackup/syddbshared -mtime -1"
  }
}
Now to test i
Whoops... Sorry about the HTML mail. I pressed the wrong button a while ago.
On Monday 04 July 2005 11:56, Jesus Salvo Jr. wrote:
> Hi All,
>
> Bacula version 1.36.3
>
>
> I changed my FileSet from:
>
> FileSet {
>   Name = "SybaseDump FileSet"
>   Include {
>     File = "/u03/sybbackup/syddbshared
On Monday 04 July 2005 11:58, Jesus Salvo Jr. wrote:
> > Now to test it out, I ran:
> >
> > * estimate job=syddb280r-sybdump listing
> > Connecting to Client syddb280r-fd at 10.0.21.65:9102
> > 2000 OK estimate files=753 bytes=82,962,185,218
> >
> > Note that it says 753 files, but if I run:
> >
>
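The find-versus-estimate comparison above can be reproduced with a self-contained sketch: with a "\\|command" FileSet, the FD runs the command and backs up exactly the paths it prints, so running the same find by hand shows what should be selected. The scratch directory and file names below are made up and stand in for /u03/sybbackup/syddbshared.

```shell
# Sketch: what a "\|find ... -mtime -1" FileSet selects (assumed file names).
dir=$(mktemp -d)
touch "$dir/today.dump"                # modified just now
touch -d '3 days ago' "$dir/old.dump"  # outside the -mtime -1 window
find "$dir" -mtime -1 -type f          # prints only today.dump
find "$dir" -mtime -1 -type f | wc -l  # prints 1
rm -rf "$dir"
```

Comparing that count against the file count reported by `estimate` is a quick way to see whether the FD is honoring the piped command.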
On Monday 04 July 2005 12:29, Jesus Salvo Jr. wrote:
> On Monday 04 July 2005 11:58, Jesus Salvo Jr. wrote:
> > > Now to test it out, I ran:
> > >
> > > * estimate job=syddb280r-sybdump listing
> > > Connecting to Client syddb280r-fd at 10.0.21.65:9102
> > > 2000 OK estimate files=753 bytes=82,962,