Hi Bill,

thank you very much for your detailed answer. Reading through the details of your
answer, everything seems convincing.

I discussed the issue yesterday with another admin/DevOps colleague. I think I
stressed the wrong point - sorry! We think our issue is:

zmbackup (Zimbra's native backup tool) doesn't run in the foreground; it immediately
detaches into the background after starting. This prevents Bacula from
waiting for its completion.
I've added the "-sync" parameter, which keeps it running in the foreground (see the
sketch below).
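
The before-script now invokes zmbackup along these lines (a simplified sketch only -
the real bacula_zimbra.sh does a bit more, and apart from "-sync" the exact zmbackup
flags shown here are illustrative):
----8<----
#!/bin/bash
# bacula_zimbra.sh - sketch: run Zimbra's native backup in the foreground
# Bacula passes the job level (%l) as $1: Full / Incremental / Differential
level="${1:-Full}"

if [ "${level}" = "Full" ]; then
    # "-sync" keeps zmbackup in the foreground, so bacula-fd (and therefore
    # the Dir) waits until the native backup has actually finished
    su - zimbra -c "zmbackup -f -a all -sync"
else
    # Zimbra has no differential level, so anything else maps to an incremental
    su - zimbra -c "zmbackup -i -a all -sync"
fi
----8<----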

The first tests I ran in Bacula went well (I couldn't test a full backup yet, because
it generates load on the Zimbra server while users are working with Zimbra).

Thank you very much!

Bruno


-----Original Message-----
From: Bill <bacula-users@lists.sourceforge.net>
To: bacula-users <bacula-users@lists.sourceforge.net>
Date: Thursday, 20 March 2025 00:41 CET
Subject: Re: [Bacula-users] Bacula + Zimbra: Run Before Backup


On 3/19/25 5:08 AM, Bruno Bartels (Intero Technologies) via Bacula-users
wrote:
> Hi all,
> I am fiddling with a problem. I am trying to set things up as described
> here:
> https://www.bacula.lat/zimbra-network-edition-backup-with-bacula/?lang=en
> This means: Bacula (Dir, SD) runs on one server, Zimbra (and bacula-fd) runs
> on another server.
> The backup on the Zimbra server should be done with Zimbra's native "zmbackup" tool.
> When this finishes, Bacula should do its own
> backup, and after that the native
> Zimbra backup should be cleaned up again by zmbackup.
> To achieve this, my job config contains:
> Runscript {
>      RunsWhen = "Before"
>      Command = "/etc/bacula/scripts/bacula_zimbra.sh %l"
>    }
>    Runscript {
>      RunsWhen = "After"
>      Command = "/etc/bacula/scripts/bacula_zimbra_delete.sh"
>    }
> Unfortunately, this doesn't work: because "RunsOnClient" is used, the Dir can
> never determine when zmbackup has finished, so
> it starts the native Bacula backup
> even while zmbackup is still running.
> Does anybody have an idea how to get out of this, please?
> Not using "RunsOnClient" doesn't seem to be an option. I have also tried
> several approaches to catch the status of zmbackup in
> the scripts, but this won't notify the Dir.
> Thank you in advance!
> Bruno

Hello Bruno,

Running a script (RunsWhen = before) on a client does work as expected: the
moving of data (the backup itself) will not start
until that script exits, and the "after" script will only run once the data
has finished being backed up.

Take a look:


I added the following RunScript resource to one of my simple backup jobs:
----8<----
Runscript {
   RunsWhen = before
   RunsOnClient = yes    # This is the default, but I prefer to be explicit
   FailJobOnError = yes  # Same as above
   Command = "/opt/comm-bacula/include/scripts/sleep.sh 90"
}
----8<----

Here is the simple `/opt/comm-bacula/include/scripts/sleep.sh` script. It
sleeps for 30 seconds by default, unless some
other number is passed on the command line (we pass 90 in the RunScript
above):
----8<----
#!/bin/bash

# waa - 20250319 - Just a simple sleep script for generic testing
# ---------------------------------------------------------------
secs="${1:-30}"
echo "Sleeping ${secs} seconds..."
sleep ${secs}
----8<----



Notice that the script starts at 17:18:23, and only after 90 seconds, at
17:19:53, does the backup job continue on its way:
----8<----
time: 2025-03-19 17:18:23
logtext: bacula-fd JobId 75599: shell command: run ClientBeforeJob 
"/opt/comm-bacula/include/scripts/sleep.sh 90"

time: 2025-03-19 17:18:23
logtext: bacula-fd JobId 75599: ClientBeforeJob: Sleeping 90 seconds...

time: 2025-03-19 17:19:53
logtext: bacula-dir JobId 75599: Sending Accurate information to the FD.

time: 2025-03-19 17:19:58
logtext: bacula-sd JobId 75599: 3307 Issuing autochanger "unload Volume 
c0_0007_0004, Slot 4, Drive 6" command.
----8<----


Your pasted configs seem OK at first glance. What I would do is temporarily add
some echo commands to both scripts to see
when/if they are even being run (see the sketch below). Also make sure you did
not make the same mistake I just made... Is it possible that you didn't
reload the Director configs? In that case neither script would even run, since
the Director would not know about them. :)
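
For the echo part, even a line like this at the top of each script will show up in
the job log (just like the "ClientBeforeJob: Sleeping 90 seconds..." line above), so
you can see immediately whether the scripts are being called at all:
----8<----
#!/bin/bash
# Temporary debug output - anything echoed here ends up in the Bacula job log
echo "bacula_zimbra.sh starting at $(date '+%F %T'), level: ${1}"
----8<----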

A bconsole command:
----8<----
* show job=xxxx
----8<----

will show you whether the Director knows about the new RunScripts in this job:
----8<----
[...snip...]
--> RunScript
   --> Command=/opt/comm-bacula/include/scripts/sleep.sh 90
   --> Target=%c
   --> RunOnSuccess=1
   --> RunOnFailure=0
   --> FailJobOnError=1
   --> RunWhen=2
[...snip...]
----8<----


The silly mistake I made was that I wrote the script on one system, then added
the RunScript to a job that runs on a
different system (i.e. a system that did not have the sleep.sh script, lol).

The problem was instantly obvious when I did that, though (i.e. "No such file or
directory"):
----8<----
19-Mar 17:16 x1carbon-fd JobId 75598: shell command: run ClientBeforeJob 
"/opt/comm-bacula/include/scripts/sleep.sh 90"
19-Mar 17:16 x1carbon-fd JobId 75598: Error: Runscript: ClientBeforeJob 
returned non-zero status=208. ERR=No such file or 
directory
19-Mar 17:16 bacula-dir JobId 75598: Fatal error: [DE0031] Bad response to 
RunBeforeNow command: wanted 2000 OK RunBeforeNow
, got 2905 Bad RunBeforeNow command.
----8<----
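
In hindsight, a quick check on the client would have caught that before even running
the job. In your case, something like this on the Zimbra machine (adjust the paths if
your scripts live elsewhere):
----8<----
# On the client (the machine running bacula-fd):
ls -l /etc/bacula/scripts/bacula_zimbra.sh /etc/bacula/scripts/bacula_zimbra_delete.sh
----8<----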


Hope this helps,
Bill

-- 
Bill Arlofski
w...@protonmail.com
_______________________________________________
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users

