On 06/01/2015 01:18 PM, Ana Emília M. Arruda wrote:
>>> I have this working here. With automatic labeling. When you submit the
>>> jobs, which messages do you see for the jobs in the "status dir" command
A-ha, it says one job is running and 10 "is waiting on max Storage jobs".
>>> Just for check
On Mon, Jun 1, 2015 at 12:19 PM, Dimitri Maziuk
wrote:
> On 2015-05-30 11:21, Ana Emília M. Arruda wrote:
> > Hi Dimitri,
> >
> > I have this working here. With automatic labeling. When you submit the
> > jobs, which messages do you see for the jobs in the "status dir" command
> > from bconsole?
On 2015-05-30 11:21, Ana Emília M. Arruda wrote:
> Hi Dimitri,
>
> I have this working here. With automatic labeling. When you submit the
> jobs, which messages do you see for the jobs in the "status dir" command
> from bconsole? Usually the messages there tell us something about why
> the jobs did
Hi Dimitri,
I have this working here. With automatic labeling. When you submit the
jobs, which messages do you see for the jobs in the "status dir" command
from bconsole? Usually the messages there tell us something about why the
jobs did not start concurrently and what the second job is waiting for
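For readers following along: the check being discussed is done from bconsole, and its output is where the "is waiting on max Storage jobs" message quoted above shows up. A minimal sketch of the commands (the storage name is a placeholder):
status dir
status storage=File1
A job held back by a concurrency limit is listed by "status dir" with a message like the one quoted above, which points at the resource (here the Storage) whose Maximum Concurrent Jobs value is capping it.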
On 2015-05-28 21:43, Ana Emília M. Arruda wrote:
> Hi Dimitri,
>
> Maybe this makes no sense, but you said "On #1 I have max concurrent
> jobs = 10 in director and storage stanzas". Is this true for #2? Do you
> have concurrency enabled in director and storage of #2?
Yes. I started with identical
Hi Dimitri,
Maybe this makes no sense, but you said "On #1 I have max concurrent jobs =
10 in director and storage stanzas". Is this true for #2? Do you have
concurrency enabled in director and storage of #2?
Best regards,
Ana
On Thu, May 28, 2015 at 3:08 PM, Dimitri Maziuk
wrote:
> On 05/28/
On 05/28/2015 12:37 PM, Dimitri Maziuk wrote:
> ... because the spool is on the (smaller) root
> drive.
PS. Looking at the logs, on #1 all 10 client jobs start within the same
couple of minutes after scheduled time and then backupcatalog job starts
2 hours later when they're done. Times are out
Hi all,
I have 2 disk-based bacula servers:
1. bacula-5.2.13 on x64 centos 6 with "single drive" vchanger -- i.e.
writing one volume at a time.
2. bacula-7.0.5 on x64 centos 7 with single filesystem storage.
On both servers job and attribute spooling are on.
On #1 I have max concurrent jobs =
Here is my scenario. I backup the clients to a disk-based storage
device. Then later that evening I have a COPY job that kicks off to
copy them to tape (LTO-5). I've set things up to run concurrent jobs
with spooling. The disk-based backups seem to run concurrently, but the
disk2tape copy job
On Monday 03 December 2012 16:34:01 lst_ho...@kwsoft.de wrote:
>
> Quoting Silver Salonen:
>
> > On Thursday 29 November 2012 16:28:11 lst_ho...@kwsoft.de wrote:
> >>
> >> Quoting Radosław Korzeniewski:
> >>
> >> > Hello,
> >> >
> >> > 2012/11/29 Silver Salonen
> >> >
> >> >> **
> >> >>
>
bacula-fd.conf (both on the server and the client)
FileDaemon {
Maximum Concurrent Jobs = X
}
bacula-sd.conf
Storage {
Maximum Concurrent Jobs = X
}
Device {
Maximum Concurrent Jobs = X
}
bacula-dir.conf
Director {
Maximum Concurrent Jobs = X
}
Job or JobDefs {
Maximum Concurrent Jobs = X
}
Quoting Silver Salonen:
> On Thursday 29 November 2012 16:28:11 lst_ho...@kwsoft.de wrote:
>>
>> Quoting Radosław Korzeniewski:
>>
>> > Hello,
>> >
>> > 2012/11/29 Silver Salonen
>> >
>> >> **
>> >>
>> >> On Thursday 29 November 2012 15:43:32 Radosław Korzeniewski wrote:
>> >>
>> >> Hello,
From: Jonathan Horne <jho...@skopos.us>
Date: Monday, December 3, 2012 9:19 AM
To: "bacula-users@lists.sourceforge.net" <bacula-users@lists.sourceforge.net>
Subject: [Bacula-users] concurrent jobs previous
On Mon, Dec 3, 2012 at 9:19 AM, Jonathan Horne wrote:
> In a previous test build, I had concurrent jobs working, so that my systems would
> back up 4 at a time. Now that I've taken it down and rebuilt it, I've
> obviously missed something because even though I have:
>
> Director {
In a previous test build, I had concurrent jobs working, so that my systems would
back up 4 at a time. Now that I've taken it down and rebuilt it, I've obviously
missed something because even though I have:
Director {          # define myself
Name = bacula-dir
DIRport = 9101
On Thursday 29 November 2012 16:28:11 lst_ho...@kwsoft.de wrote:
>
> Quoting Radosław Korzeniewski:
>
> > Hello,
> >
> > 2012/11/29 Silver Salonen
> >
> >> **
> >>
> >> On Thursday 29 November 2012 15:43:32 Radosław Korzeniewski wrote:
> >>
> >> Hello,
> >>
> >> 2012/11/29 Silver Salonen
> >
Quoting Radosław Korzeniewski:
> Hello,
>
> 2012/11/29 Silver Salonen
>
>> **
>>
>> On Thursday 29 November 2012 15:43:32 Radosław Korzeniewski wrote:
>>
>> Hello,
>>
>> 2012/11/29 Silver Salonen
>>
>>
>> > sd1<--dir1 ---> client1 <--- dir2-->sd2
>>
>> Is your "client1" a Windows mac
Hello,
2012/11/29 Silver Salonen
> **
>
> On Thursday 29 November 2012 15:43:32 Radosław Korzeniewski wrote:
>
> Hello,
>
> 2012/11/29 Silver Salonen
>
>
> > sd1<--dir1 ---> client1 <--- dir2-->sd2
>
> Is your "client1" a Windows machine? If so, do you use VSS Enable = Yes in
> your Fil
On Thursday 29 November 2012 15:43:32 Radosław Korzeniewski wrote:
Hello,
2012/11/29 Silver Salonen
> sd1<--dir1 ---> client1 <--- dir2-->sd2
Is your "client1" a Windows machine? If so, do you use VSS Enable = Yes in your
FileSet resource?
Yes.
So, the current limitation of bacula-
Hello,
2012/11/29 Silver Salonen
> **
>
>
> > sd1<--dir1 ---> client1 <--- dir2-->sd2
>
> Is your "client1" a Windows machine? If so, do you use VSS Enable = Yes in
> your FileSet resource?
>
>
> Yes.
>
>
>
So, the current limitation of the bacula-fd client on Windows does not permit
concur
On Thursday 29 November 2012 15:06:28 Radosław Korzeniewski wrote:
Hello,
2012/11/29 Silver Salonen
On Thursday 29 November 2012 07:58:23 Dan Langille wrote:
> On 2012-11-29 07:02, Silver Salonen wrote:
> > Hi.
> >
> > I'm backing up some servers with multiple Bacula directors.
> >
> > Although
On Thursday 29 November 2012 08:27:16 Dan Langille wrote:
> On 2012-11-29 08:18, Silver Salonen wrote:
> > On Thursday 29 November 2012 07:58:23 Dan Langille wrote:
> >> On 2012-11-29 07:02, Silver Salonen wrote:
> >> > Hi.
> >> >
> >> > I'm backing up some servers with multiple Bacula directors.
>
Hello,
2012/11/29 Silver Salonen
> On Thursday 29 November 2012 07:58:23 Dan Langille wrote:
> > On 2012-11-29 07:02, Silver Salonen wrote:
> > > Hi.
> > >
> > > I'm backing up some servers with multiple Bacula directors.
> > >
> > > Although I've set "Maximum Concurrent Jobs = 10" in FD's
> > >
On 2012-11-29 08:18, Silver Salonen wrote:
> On Thursday 29 November 2012 07:58:23 Dan Langille wrote:
>> On 2012-11-29 07:02, Silver Salonen wrote:
>> > Hi.
>> >
>> > I'm backing up some servers with multiple Bacula directors.
>> >
>> > Although I've set "Maximum Concurrent Jobs = 10" in FD's
>> >
On Thursday 29 November 2012 07:58:23 Dan Langille wrote:
> On 2012-11-29 07:02, Silver Salonen wrote:
> > Hi.
> >
> > I'm backing up some servers with multiple Bacula directors.
> >
> > Although I've set "Maximum Concurrent Jobs = 10" in FD's
> > configuration,
>
> You also have to look at the SD
On 2012-11-29 07:02, Silver Salonen wrote:
> Hi.
>
> I'm backing up some servers with multiple Bacula directors.
>
> Although I've set "Maximum Concurrent Jobs = 10" in FD's
> configuration,
You also have to look at the SD and Dir 'maximum concurrent jobs'
settings as well.
> it seems that these
Hi.
I'm backing up some servers with multiple Bacula directors.
Although I've set "Maximum Concurrent Jobs = 10" in FD's configuration, it
seems that these multiple directors cannot run jobs concurrently - when one is
waiting for a new volume or something, the other just keeps waiting.
Anyone
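A minimal sketch of the client side of such a setup (names and passwords are placeholders, not from the thread); note that the FileDaemon limit is shared across every director defined in bacula-fd.conf:
bacula-fd.conf
Director {
Name = dir1-dir
Password = "secret1"
}
Director {
Name = dir2-dir
Password = "secret2"
}
FileDaemon {
Name = client1-fd
Maximum Concurrent Jobs = 10    # one pool of job slots, shared by jobs from both directors
}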
On Wed, Aug 8, 2012 at 5:20 AM, Jummo wrote:
> Hi all,
>
> I have a question about data spooling in bacula.
>
> If a job runs and writes directly to tape without spooling (e.g. a Copy Job)
> and pool "tape_full" on storage "autochanger1" (two tape drives) and a second
> job is started with spooli
Hi all,
I have a question about data spooling in bacula.
If a job runs and writes directly to tape without spooling (e.g. a Copy Job) and
pool "tape_full" on storage "autochanger1" (two tape drives) and a second job
is started with spooling (e.g. Backup Job) and pool "tape_full" on storage
"aut
On 08/06/2012 14:41, Julien Cochennec wrote:
> On 08/06/2012 13:54, Uwe Schuerkamp wrote:
>> Maximum Concurrent Jobs = 20
> Ok, I'll do this, thanks. The default value is 1?
Great! It works! All Jobs together, thanks a lot!
On 08/06/2012 13:54, Uwe Schuerkamp wrote:
> Maximum Concurrent Jobs = 20
Ok, I'll do this, thanks. The default value is 1?
When you wrote:
> (the
> output of "stat dir" during a backup run should give you an idea
> what's blocking).
You mean bconsole, then status, then director? If so I alwa
On Fri, Jun 08, 2012 at 01:43:52PM +0200, Julien Cochennec wrote:
> >
> Hello Uwe,
> I don't have this directive anywhere; where should I put it?
> My director has only include links, see below.
> I saw many posts about this parameter but it appears sometimes in
> device, sometimes in storage, sometimes in
On 08/06/2012 11:21, Uwe Schuerkamp wrote:
> Hello Julien,
>
> what are your settings for "Maximum Concurrent Jobs" for storage, sd,
> director, fd's and so on? What exactly is Bacula waiting for (the
> output of "stat dir" during a backup run should give you an idea
> what's blocking).
>
> All
On Fri, Jun 08, 2012 at 10:28:03AM +0200, Julien Cochennec wrote:
> Hi,
> New to this list and a Bacula newbie; almost everything works great here,
> backing up around 50 clients, except one thing.
> I followed this example
> http://www.bacula.org/fr/dev-manual/Basic_Volume_Management.html#SECTION00122
Hi,
New to this list and a Bacula newbie; almost everything works great here,
backing up around 50 clients, except one thing.
I followed this example
http://www.bacula.org/fr/dev-manual/Basic_Volume_Management.html#SECTION00122.
It's about concurrent disk jobs.
The problem is that altho
Hello Adrian,
thank you, it works. I tested this two years ago and it didn't work as
it should. So I never looked at the priorities again. Actually
I don't know whether there was a bug in Bacula or in my two-year-old config.
But I still use many parts of my old configuration where the
Hi Markus,
On Thu, Mar 08, 2012 at 09:09:22AM +0100, Markus Kress wrote:
> defined sequence. In other words: only the backup and verify jobs should
> run concurrently, the admin jobs in a defined sequence before and after all
> other jobs.
> admin job 1
> admin job 2
> admin job 3
> backup job cli
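For reference, the usual way to get that ordering is the Priority directive in bacula-dir.conf. A rough sketch (job names are made up; lower numbers run first, and by default jobs of different priorities do not run at the same time):
Job {
Name = "admin-before-backups"
Type = Admin
Priority = 5
# plus the usual Client/Schedule lines
}
Job {
Name = "backup-client1"
Type = Backup
Priority = 10
# plus the usual Client/FileSet/Storage/Pool lines
}
Job {
Name = "admin-after-backups"
Type = Admin
Priority = 20
}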
On 8/03/2012 9:09, Markus Kress wrote:
Hello
I did not understand how Maximum Concurrent Jobs works, in particular the
sentence "Note, this directive limits only Jobs with the same name as the
resource in which it appears ".
I try to describe what I want. I have some jobs of type admin. They have
Hello
I did not understand how Maximum Concurrent Jobs works, in particular the
sentence "Note, this directive limits only Jobs with the same name as the
resource in which it appears ".
I try to describe what I want. I have some jobs of type admin. They have to
be run before backup jobs in a defined
On Tue, 2011-07-05 at 09:03 +0200, Uwe Mohn wrote:
> I'd suggest using different priorities for your virtual fulls since
> bacula won't start a job with a higher bacula priority (meaning lesser
> importance)
> before a job with a lower bacula priority (meaning higher importance) has
> terminated. For
Hello,
usually I do not have problem with concurrent jobs, because they
read/write from/to different pools. But when it comes to virtual full
backups I need to stop bacula from doing these jobs concurrently.
Is there a way to tell bacula to do a certain type of job one at a
time even if the c
>> I am not using a spool file because of lack of disk space; I will see
>> how performance is tonight without this file, fingers crossed.
>>
>
> When you back up to tape it is _always_ recommended to use data
> spooling unless your local disks are slower than the backup network and remote
> clients.
Hello,
2011/5/26 Andrés Yacopino
> Thanks for replying John.
>
> I am not using a spool file because of lack of disk space; I will see
> how performance is tonight without this file, fingers crossed.
>
>
When you back up to tape it is _always_ recommended to use data
spooling unless your local
Thanks for replying John.
Well, because of your advice I have changed the Storage part of bacula-dir
to 2 maximum jobs, thanks for that.
I have also realized that having different priorities on jobs prevents
them from running concurrently, so I have put the same priority on all the
jobs, in this case 1 for all
> I am backing up 7 servers on a daily basis; one of the servers takes 5
> hours to back up (the other 6 take 2 hours), so I want to back up the
> large server at the same time as I am backing up the other servers.
> I have one LTO4 drive, and the backups are running sequentially, one
> after another from
I am backing up 7 servers on a daily basis; one of the servers takes 5
hours to back up (the other 6 take 2 hours), so I want to back up the
large server at the same time as I am backing up the other servers.
I have one LTO4 drive, and the backups are running sequentially, one
after another from different
Thanks, I will try a concurrency of 2, combined with 500 GB of spooling, since
the LTO-4 drives are faster than the network.
-S
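For anyone setting this up, a rough sketch of the directives involved on both sides (the size and path are illustrative, not from the thread):
bacula-dir.conf (Job or JobDefs)
Spool Data = yes
Spool Attributes = yes
bacula-sd.conf (Device)
Spool Directory = /var/spool/bacula
Maximum Spool Size = 500G
With these set, jobs spool to disk concurrently and each job despools to the tape drive in turn.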
> Hello,
> I'm completely lost regarding the Maximum Concurrent Jobs directive on
> multiple places.
>
> First, my setup:
> -Autochanger, about 70 slots
> -2 LTO4 Drives inside autochanger
> -2 backup Pools defined, and a Scratch pool. pool1 used by Job1 for unix
> clients, pool2 used by Job2 and
Hello,
I'm completely lost regarding the Maximum Concurrent Jobs directive on
multiple places.
First, my setup:
-Autochanger, about 70 slots
-2 LTO4 Drives inside autochanger
-2 backup Pools defined, and a Scratch pool. pool1 used by Job1 for unix
clients, pool2 used by Job2 and Job3 for window
'pedro moreno' wrote:
> Hi my friends.
>
> I have bacula running on my server with Centos x64 5.5
>Raid-5+LTO-2(Tandberg) external.
>
> My doubts are with bacula-sd concurrent jobs.
>
> You people that have disk-based and tape backups, what is the maximum
> number of jobs you are running on disk or tape at the sa
Hi my friends.
I have bacula running on my server with Centos x64 5.5
Raid-5+LTO-2(Tandberg) external.
My doubts are with bacula-sd concurrent jobs.
You people that have disk-based and tape backups, what is the maximum
number of jobs you are running on disk or tape at the same time (2, 3, 4, etc.) on
each?
D
On Tue, 18 May 2010 12:01:02 -0700
Lampzy wrote:
> Right now the jobs are running one by one. I read all the documentation
> I can find and still can't figure out how to configure it to spool 4
> jobs simultaneously and de-spool them one by one to the tape drive.
I just de-spool them all simul
Hi folks,
I'm trying to set up concurrent jobs in Bacula 5.0.2 in such a way that it
will spool multiple backup jobs and despool them one by one to tape.
All backup jobs run at the same time, have the same priority (10 by
default) and backup different clients. Spooling is enabled.
Right now the jo
2010/4/23 António Inês Silva :
>
> Good morning,
>
> Assuming a storage device supported on disk files, would this type of
> configuration output each concurrent job for the particular storage
> into a different file:
>
> Director configuration
>
> Storage {
> Name = DiskFilesStorage
> De
Good morning,
Assuming a storage device supported on disk files, would this type of
configuration output each concurrent job for the particular storage
into a different file:
Director configuration
Storage {
Name = DiskFilesStorage
Device = SomeDisk
MediaType = Files
Maximu
On Wednesday 24 February 2010 19:28:05 John Drescher wrote:
> > If volumes were files, there wouldn't be any need to limit them for devices
> > which would be directories in that context.
> >
>
> Again the limit is only 1 volume can be loaded in 1 storage device at
> a time. This is not that big
On Wed, Feb 24, 2010 at 2:14 PM, Phil Stracchino wrote:
> On 02/24/10 08:07, Silver Salonen wrote:
>> On Tuesday 23 February 2010 19:09:49 Phil Stracchino wrote:
>>> On 02/23/10 06:32, Silver Salonen wrote:
I consider it a bug, but looks like devs do not. Any opinions?
>>>
>>> I ran into this
On 02/24/10 08:07, Silver Salonen wrote:
> BTW, this part is very obscure in the manual:
> "if you want two different jobs to run simultaneously backing up the same
> Client to the same Storage device, they will run concurrently only if you
> have
> set Maximum Concurrent Jobs greater than one i
On 02/24/10 08:07, Silver Salonen wrote:
> On Tuesday 23 February 2010 19:09:49 Phil Stracchino wrote:
>> On 02/23/10 06:32, Silver Salonen wrote:
>>> I consider it a bug, but looks like devs do not. Any opinions?
>>
>> I ran into this problem when I first upgraded to 3.0.3. It turned out
>> to be
> If volumes were files, there wouldn't be any need to limit them for devices
> which would be directories in that context.
>
Again the limit is only 1 volume can be loaded in 1 storage device at
a time. This is not that big of a limitation because with disk you
can have 1 storage devices if
On Wednesday 24 February 2010 17:03:58 Josh Fisher wrote:
> On 2/24/2010 9:25 AM, Silver Salonen wrote:
> > It's like assuming that the "ultimate" backup-devices are tapes. And as I
> > don't think that way, it's so annoying these design decisions rely on
> > somebody's (emotional/historical) opini
On Wed, Feb 24, 2010 at 9:25 AM, Silver Salonen wrote:
> On Wednesday 24 February 2010 15:58:57 John Drescher wrote:
>> > OK. I have never used tapes with Bacula. But I'd expect a file-type device to
>> > be able to load more than 1 volume at a time. It's quite trivial, isn't it?
>> >
>> This
On 2/24/2010 9:25 AM, Silver Salonen wrote:
> On Wednesday 24 February 2010 15:58:57 John Drescher wrote:
>
>>> OK. I have never used tapes with Bacula. But I'd expect a file-type device to
>>> be able to load more than 1 volume at a time. It's quite trivial, isn't
>>>
On Wednesday 24 February 2010 16:42:50 John Drescher wrote:
> On Wed, Feb 24, 2010 at 9:25 AM, Silver Salonen wrote:
> > What's the use of treating all the devices the same way anyway? Ease of
> > programming? Even though it makes this part of the whole project so rigid?
> >
>
> Ease of programmi
On Wednesday 24 February 2010 15:58:57 John Drescher wrote:
> > OK. I have never used tapes with Bacula. But I'd expect a file-type device to
> > be able to load more than 1 volume at a time. It's quite trivial, isn't it?
> >
> This was a design decision that all devices are treated the same way.
> Are you saying that for concurrent jobs to work I have to run these different
> jobs into the same volume? It doesn't make any sense in the context of disk-
> based backups.
>
If they are not the same volume then you need to have more than 1
storage device. Remember that only 1 volume can be loade
On Wednesday 24 February 2010 15:34:27 John Drescher wrote:
> On Wed, Feb 24, 2010 at 8:07 AM, Silver Salonen wrote:
> > On Tuesday 23 February 2010 19:09:49 Phil Stracchino wrote:
> >> On 02/23/10 06:32, Silver Salonen wrote:
> >> > I consider it a bug, but looks like devs do not. Any opinions?
>
> OK. I have never used tapes with Bacula. But I'd expect a file-type device to
> be able to load more than 1 volume at a time. It's quite trivial, isn't it?
>
This was a design decision that all devices are treated the same way.
>
> Anyway, the "1 volume at a time"-limit has always been "one job
On Wed, Feb 24, 2010 at 8:07 AM, Silver Salonen wrote:
> On Tuesday 23 February 2010 19:09:49 Phil Stracchino wrote:
>> On 02/23/10 06:32, Silver Salonen wrote:
>> > I consider it a bug, but looks like devs do not. Any opinions?
>>
>> I ran into this problem when I first upgraded to 3.0.3. It tur
On Tuesday 23 February 2010 19:09:49 Phil Stracchino wrote:
> On 02/23/10 06:32, Silver Salonen wrote:
> > I consider it a bug, but looks like devs do not. Any opinions?
>
> I ran into this problem when I first upgraded to 3.0.3. It turned out
> to be a configuration issue. Make sure you have th
On Wednesday 24 February 2010 13:38:17 Martin Simmons wrote:
> > On Wed, 24 Feb 2010 11:58:22 +0200, Silver Salonen said:
> > No, the default for devices has always been to allow only 1 job.
>
> That's not correct. Bacula has always been able to run multiple concurrent
> jobs to the same devi
> On Wed, 24 Feb 2010 11:58:22 +0200, Silver Salonen said:
>
> On Tuesday 23 February 2010 20:45:26 Martin Simmons wrote:
> > I think you have the concept backwards -- it is designed to prevent
> > concurrency on that device rather than allowing more of it.
> >
> > The default allows an unlim
On Tuesday 23 February 2010 19:09:49 Phil Stracchino wrote:
> On 02/23/10 06:32, Silver Salonen wrote:
> > I consider it a bug, but looks like devs do not. Any opinions?
>
> I ran into this problem when I first upgraded to 3.0.3. It turned out
> to be a configuration issue. Make sure you have th
On Tuesday 23 February 2010 20:45:26 Martin Simmons wrote:
> I think you have the concept backwards -- it is designed to prevent
> concurrency on that device rather than allowing more of it.
>
> The default allows an unlimited number of jobs to be queued (or run
> concurrently on a single volume).
I think you have the concept backwards -- it is designed to prevent
concurrency on that device rather than allowing more of it.
The default allows an unlimited number of jobs to be queued (or run
concurrently on a single volume). The new resource allows you to force jobs
to run on another "compat
On 02/23/10 06:32, Silver Salonen wrote:
> I consider it a bug, but looks like devs do not. Any opinions?
I ran into this problem when I first upgraded to 3.0.3. It turned out
to be a configuration issue. Make sure you have the desired level of
concurrency enabled in ALL applicable resources (i.
I consider it a bug, but looks like devs do not. Any opinions?
http://bugs.bacula.org/view.php?id=1508
--
Silver
On Tuesday 16 February 2010 09:56:21 Silver Salonen wrote:
> Hi.
>
> In 5.0 there is directive "Maximum Concurrent Jobs" for devices too, which
> should mean that it's now possible
I suppose this is true. Sorry, I can't help then.
-Original Message-
From: Silver Salonen [mailto:sil...@ultrasoft.ee]
Sent: 16 February 2010 09:00
To: bacula-users@lists.sourceforge.net
Cc: Beck J Mr
Subject: Re: [Bacula-users] concurrent jobs on the same storage
I have set ma
irst job has
> completed.
>
> James
>
> -Original Message-
> From: Silver Salonen [mailto:sil...@ultrasoft.ee]
> Sent: 16 February 2010 07:56
> To: bacula-users@lists.sourceforge.net
> Subject: [Bacula-users] concurrent jobs on the same storage
>
> Hi.
-Original Message-
From: Silver Salonen [mailto:sil...@ultrasoft.ee]
Sent: 16 February 2010 07:56
To: bacula-users@lists.sourceforge.net
Subject: [Bacula-users] concurrent jobs on the same storage
Hi.
In 5.0 there is directive "Maximum Concurrent Jobs" for devices too,
which s
Hi.
In 5.0 there is directive "Maximum Concurrent Jobs" for devices too, which
should mean that it's now possible to run multiple jobs simultaneously on the
same device and therefore on the same storage. Right?
I have "Maximum Concurrent Jobs = 20" set for SD, for 'storage-silver' and for
'dev
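For context, a 5.0-style file device with that directive looks roughly like this (the name and path are placeholders):
Device {
Name = FileStorage
Media Type = File
Archive Device = /backup
LabelMedia = yes
Random Access = yes
AutomaticMount = yes
Maximum Concurrent Jobs = 20    # per-device limit added in 5.0, discussed in this thread
}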
Hello
We have switched to Bacula from Backup Exec and, being software developers
ourselves, really appreciate Bacula and how simple it is to get going
quickly.
We have two autoloaders, a TL4000 and a Dell PV132T; each system has two
drives in it.
Therefore I am assuming that at any one time
Media Type = Ultrium1
Autochanger = yes
Maximum Concurrent Jobs = 2
}
-Original Message-
From: John Drescher [mailto:dresche...@gmail.com]
Sent: 01 November 2009 09:12
To: Brian Jobling; bacula-users
Subject: Re: [Bacula-users] Co
Hello
We have switched to Bacula from Backup Exec and, being software developers
ourselves, really appreciate Bacula and how simple it is to get going
quickly.
We have two autoloaders, a TL4000 and a Dell PV132T; each system has two
drives in it.
Therefore I am assuming that at any one time Bacula s
> We have switched to Bacula from Backup Exec and, being software developers
> ourselves, really appreciate Bacula and how simple it is to get going
> quickly.
>
> We have two autoloaders, a TL4000 and a Dell PV132T; each system has two
> drives in it.
>
Are the changers attached to different machines?
John Drescher wrote:
>> I'm trying to figure out why I can't run concurrent jobs on my
>> installation. The Director, the FileDaemon and the Storage are all set
>> as:
>>
>> Maximum Concurrent Jobs = 20
>>
>>
> There are 3 to 5 places you need this in bacula-dir.conf
>
> did you do that?
if not,
> I'm trying to figure out why I can't run concurrent jobs on my installation.
> The Director, the FileDaemon and the Storage are all set as:
>
> Maximum Concurrent Jobs = 20
>
There are 3 to 5 places you need this in bacula-dir.conf
did you do that?
John
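For reference, a sketch of the bacula-dir.conf resources that accept the directive, in the same style as the list earlier in this thread (X is whatever limit you want; the lowest applicable limit wins):
bacula-dir.conf
Director {
Maximum Concurrent Jobs = X
}
Client {
Maximum Concurrent Jobs = X
}
Storage {
Maximum Concurrent Jobs = X
}
Job or JobDefs {
Maximum Concurrent Jobs = X
}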
Hi,
I'm trying to figure out why I can't run concurrent jobs on my
installation. The Director, the FileDaemon and the Storage are all set as:
Maximum Concurrent Jobs = 20
All the bacula setup resides on a single server: 2 SCSI tapes, data to
backup, and bacula progs. I have two jobs I would
2009/5/14 Jayson Broughton :
> I know this question has been asked a million times on this list (and yes,
> I went through the nabble's bacula-users archives over 2 days) but I think
> my situation is a little more unique. So if anyone could help me
> out, I would appreciate it!
>
>
>
>
I know this question has been asked a million times on this list (and yes,
I went through the nabble's bacula-users archives over 2 days) but I think
my situation is a little more unique. So if anyone could help me
out, I would appreciate it!
Here's the background:
We have 150+ clie
Michael Zehrer wrote:
> Hi,
>
> I have a question about concurrent jobs. I have two types of backups running
> on different schedules. One is a tape backup, that includes several jobs with
> different priorities. The other is a disk backup that also includes different
> jobs. What I want is tha
Hi,
I have a question about concurrent jobs. I have two types of backups running on
different schedules. One is a tape backup, that includes several jobs with
different priorities. The other is a disk backup that also includes different
jobs. What I want is that the tape and the disk backup can
Hi,
16.04.2009 09:04, James Harper wrote:
> I am using one disk volume per job, and I would like to run multiple
> backups at once. This would require the storage daemon to have multiple
> volumes open at once.
Right. More in Bacula terms, you'd need several storage devices.
> Is this supported?
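In configuration terms that would look roughly like this (names, media types and paths are placeholders): several Device resources in bacula-sd.conf, each of which can keep its own volume open, with a matching Storage resource in bacula-dir.conf for each one.
bacula-sd.conf
Device {
Name = FileDev1
Media Type = File1
Archive Device = /backup/dev1
Random Access = yes
AutomaticMount = yes
LabelMedia = yes
}
Device {
Name = FileDev2
Media Type = File2
Archive Device = /backup/dev2
Random Access = yes
AutomaticMount = yes
LabelMedia = yes
}
This is essentially the approach in the Basic Volume Management chapter linked earlier in the thread.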
I am using one disk volume per job, and I would like to run multiple
backups at once. This would require the storage daemon to have multiple
volumes open at once. Is this supported? I would have guessed not, but the
docs say "... This can be avoided by having each simultaneous job write
to a differen
On Mon, 6 Oct 2008, Kjetil Torgrim Homme wrote:
> > If you allow mixed priority jobs to run simultaneously, you also
> > need some way of flagging a job as "exclusive"
>
> this is accomplished by not setting Allow Mixed Priority on
> BackupCatalog.
OK, it wasn't clear in your earlier description
Alan Brown <[EMAIL PROTECTED]> writes:
> On Mon, 6 Oct 2008, Kjetil Torgrim Homme wrote:
>
>> This directive is only implemented in version 2.5 and later. When
>> set to yes (default no), this job may run even if lower
>> priority jobs are already running. This means a high
On Mon, 6 Oct 2008, Kjetil Torgrim Homme wrote:
> This directive is only implemented in version 2.5 and later. When
> set to yes (default no), this job may run even if lower
> priority jobs are already running. This means a high priority job
> will not have to wait for ot
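A minimal sketch of a job using the directive quoted above (Bacula 2.5 or later; the job name is a placeholder):
Job {
Name = "high-priority-backup"
Type = Backup
Priority = 5
Allow Mixed Priority = yes    # may start even while jobs with higher Priority numbers are running
# plus the usual Client/FileSet/Storage/Pool lines
}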
Jason Dixon <[EMAIL PROTECTED]> writes:
> That was just an overview. Each Job is tied to a single client. I
> haven't been able to get this working properly yet; the lower
> priority jobs always "multiplex" (to use a NetBackup term)
> concurrently and force the higher priority job to wait.
My pa
On Tue, 12 Feb 2008, le dahut wrote:
> Does this mean that the second job (the one with a higher priority number)
> will be "forgotten", or that it will be run once the first job has finished?
It will be run after the first job is finished.
Does this mean that the second job (the one with a higher priority
number) will be "forgotten", or that it will be run once the first job
has finished?
Alan Brown wrote:
> On Fri, 8 Feb 2008, le dahut wrote:
>
>> I'll use priority, since both jobs must be based on the same schedule
>> inc