Hi!

On Sunday 26 August 2007, James Harper wrote:
> I had the following configuration:
>
> "
> Director
>  Max Concurrent Jobs = 1
>
> Job1
>  Client = Client1
>  Storage = Storage1
>
> Job2
>  Client = Client2
>  Storage = Storage2
> "
>
> Then it occurred to me that it would be perfectly reasonable to run 2
> jobs at the same time if they were both using completely different
> Client and Storage resources, so I set Max Concurrent Jobs = 2 in the
> Director and issued a 'reload'. But Job1 and Job2 don't seem to run
> concurrently. If I start Job1 and then Job2, Job2 sits at "waiting
> execution".
>
> My questions are:
>
> 1. Should Job1 and Job2 run concurrently after the configuration change,
> or is there something I've overlooked?
>
> 2. Is a reload sufficient, or do I need to restart the director to make
> this change take effect?

It seems that you have to hard restart the director when changing the director 
concurrency; a simple reload did not work in my case (although the status 
command _does_ report the new concurrency). I didn't dig in to verify whether 
there is another way to do it, but at least a full restart worked for me :-)... 
When you restart, check that the daemon was really stopped by running `pidof 
bacula-dir`: in my case at least, the Debian init script didn't work and left 
the daemons running untouched...
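
For the record, here is roughly the sequence that worked for me. It's just a 
sketch, assuming the init script is /etc/init.d/bacula-director as on my 
Debian box; the script name may differ on your system:

  # stop the director, then verify it is really gone (don't trust the
  # init script's exit status, it lied on my machine)
  /etc/init.d/bacula-director stop
  pidof bacula-dir && echo "bacula-dir still running, kill it by hand"

  # start it again; the new Max Concurrent Jobs value is read at startup
  /etc/init.d/bacula-director start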

I think this is a bug (at least a documentation bug :-) ), but I am not sure 
and don't have the time to dig into the details.

Hope it helps,
  Andreas
