I see you are running the job on the
Client. On the client, the Job is finished so the script is run,
but in the SD and Dir, the Job is still running. Though it is a
bit odd and perhaps confusing, Bacula is functioning as it was
designed to.
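
The client/director timing distinction above determines where a RunScript executes relative to job completion. A minimal sketch of the placement (standard Bacula RunScript syntax; the job name and script path here are hypothetical):

```
Job {
  Name = "example-job"
  # ... other directives ...
  RunScript {
    RunsWhen = After
    RunsOnClient = yes      # runs on the FD, where the Job is already "finished"
                            # even while the Dir and SD are still wrapping up
    Command = "/usr/local/bin/post-backup.sh"
  }
}
```

Setting RunsOnClient = no instead makes the Director execute the command, after the Director's side of the job has completed.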
Best regards,
Hello Kern,
thank you for your answer. Here is my job definition:
Job {
  Name = "monitor-host.logs"
  JobDefs = "DefaultJob"
  Storage = EAST
  Schedule = "Early"
  Write Bootstrap = "/opt/bacula/var/bacula/working/monitor-host.bsr"
  Client = monitor-host
  FileSet = "monitor-host.logs"
  RunS
Hi folks,
I tried the new "stop / resume" functions on our bacula server for the
first time today (7.4.4 / MariaDB / CentOS6 compiled from source).
While "stop" seemed to work OK and left the job in an "incomplete" state
after finishing the "spooling attributes" bit, "resume" just sat there
for an
Hello,
When a Bacula backup job terminates, all the File table entries are
already in the catalog. So about the only thing that makes any sense is
that you are running the script before the backup completes. Perhaps
simply adding a "wait" just before your llist would solve the problem. H
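
That suggestion could look something like the following bconsole wrapper. This is only a sketch: the bconsole path and the way the JobId is passed in are assumptions (from a RunScript, %i would typically supply the JobId).

```
#!/bin/sh
# Hypothetical post-backup script; pass the JobId as the first argument.
jobid="$1"

# "wait" blocks until the named job has fully terminated on the Director,
# so the File table entries are committed before llist queries them.
/opt/bacula/bin/bconsole <<EOF
wait jobid=${jobid}
llist files jobid=${jobid}
EOF
```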
Hi guys,
I'm using Bacula 7.4.4 and have a script which runs after every backup
and expects a list of the file names that were backed up. Now I noticed
that Bacula writes the files to the Catalog at the end of the backup
process, so the script receives an empty list and doesn't run
properly.