In this email I will give some examples of how to send commands to
bconsole from within a shell script. I will attach one of my shell scripts
that *DOES NOT do what you need.* It is provided only as an example of how
a shell script could be made to interact with bacula. There is some error
checking functionality built in that could be useful.

At its most basic, a bconsole script looks like this:

#!/bin/bash
# bconsole bin and conf locations
bcbin="/opt/bacula/bin/bconsole"
bccfg="/opt/bacula/etc/bconsole.conf"

# commands I want to send to bconsole, echoed on one line with newline characters
echo -e "my first bconsole command\nmy second bconsole command\nquit\n" | "${bcbin}" -c "${bccfg}"
# end of script

Effectively you are echoing everything you would type into bconsole on one
line, separated by newline characters represented by \n. This input can of
course include variables.
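
For example, here is a minimal sketch with a variable substituted into the
command string (the "list volume=..." command and the argument handling are
just illustrations, not part of my attached script):

#!/bin/bash
# bconsole bin and conf locations
bcbin="/opt/bacula/bin/bconsole"
bccfg="/opt/bacula/etc/bconsole.conf"

# volume name passed as the script's first argument
volume="$1"

echo -e "list volume=${volume}\nquit\n" | "${bcbin}" -c "${bccfg}"
# end of sketch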

You can schedule scripts to be run by bacula jobs of type=admin, though
they will be subject to the usual priority limitations (higher priority
jobs will be run before a lower priority admin job unless you set "Allow
Mixed Priority = yes" in every job). You probably don't want to do that. I
think that in this case, an admin job waiting until all other jobs have
finished could avoid parsing the available jobs and volumes too soon. If
you don't want the script that does this work to be possibly delayed by
other jobs, you could schedule it as a cron job instead.
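
If you go the cron route, the crontab entry on the Director host might look
something like this (the path and time here are placeholders, not what I
actually use):

# run the admin script nightly at 23:30, keeping a simple run history
30 23 * * * /opt/bacula/etc/my-bconsole-script.sh >> /var/log/bacula/my-bconsole-script.cron.log 2>&1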

Here is my job that runs an admin script:
Job {
  Name = "admin-cloud-volume-part-sweeper-job"
  Type = admin
  Level = Full
  Schedule = "NightlyAfterCopy"
  Storage = "None"
  Fileset = "None"
  Pool = "None"
  JobDefs = "Synology-Local"
  Runscript {
     RunsWhen = before
     # Default is yes, but there is no client in an Admin job & Admin Job RunScripts *only* run on the Director :)
     RunsOnClient = no
     # This could be `Console` if you wish to send console commands, but don't use that*
     Command = "/opt/bacula/etc/cloud-volume-part-sweeper.sh"
  }
  Priority = 30
}

The config parser requires you to specify Storage, Fileset, Level, and
Pool, even though the admin job doesn't use them. The resources you name
must actually exist. I have implemented Bill A's dummy resources, so I
have empty resources with the name "None".
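
For illustration, dummy resources along these lines are the general idea
(this is only a sketch, not my actual config; your Director may require
additional directives):

Pool {
  Name = "None"
  Pool Type = Backup
}

Fileset {
  Name = "None"
  Include {
    Options {
      signature = MD5
    }
  }
}

Storage {
  Name = "None"
  Address = localhost       # never actually contacted by an admin job
  Password = "notused"
  Device = "None"
  Media Type = "None"
}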

* Technically, an admin job does allow you to specify the commands you
want to give to bconsole directly, without a script, using 'console', but
this is a bad idea: your input is limited to what bconsole can do, and
anything you send to bconsole via 'console' in an admin job is entered
into the bacula log under jobid 0, which makes it hard to find out later
what an admin job did.
By default, all the output from a script run by an admin job goes into the
joblog of the admin job that ran the script.
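
For completeness, the 'console' form I'm advising against looks roughly
like this (the command shown is only an example):

  Runscript {
     RunsWhen = before
     RunsOnClient = no
     Console = "list volumes"   # output ends up in the bacula log under jobid 0
  }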

I have attached my cloud volume part sweeper script. *This script doesn't
do what you need, but maybe you could adapt it to your purposes.* I wrote
it because bacula doesn't appear to throw any errors or fail jobs if a
cloud volume part fails to upload. bconsole specifically doesn't return an
exit code that indicates there were issues during upload. Structurally,
there is no way for bacula or bconsole to put a job in an error or even
'ok with warnings' status if cloud volume parts fail to upload. However,
if a script run by an admin job exits with 1, the admin job is listed by
bacula as 'failed'. You probably don't need the last two error-checking if
statements, but all the other error checking could be useful. Even if it
isn't, hopefully you can use this as a framework to get the information
you need programmatically.

My script is set up to:
1. Do a bunch of sanity checks and some script-specific log rotation.
2. Interact with bconsole, logging all output from bconsole in a file.
Past instances of this logfile were rotated in the previous step, so the
logfile only contains output from this session.
3. Analyze the script's logfile after all the bconsole work is done,
checking for failure states like a logfile that doesn't exist or can't be
read, then checking the log for keywords that indicate an error.
4. Exit 1 if anything bad is detected, so that the admin job fails.

I say all this to explain that most of the script is housekeeping and
safety checks. The actual bconsole work is very simple.
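
The overall skeleton looks roughly like this (a sketch of the structure
only, not the attached script; the log path, commands, and error keywords
are placeholders):

#!/bin/bash
# bconsole bin, conf, and script logfile locations
bcbin="/opt/bacula/bin/bconsole"
bccfg="/opt/bacula/etc/bconsole.conf"
log="/opt/bacula/working/my-admin-script.log"

# 1. Sanity checks (abridged)
[ -x "${bcbin}" ] || exit 1
[ -r "${bccfg}" ] || exit 1

# 2. Do the bconsole work, capturing everything it prints
echo -e "my first bconsole command\nmy second bconsole command\nquit\n" | "${bcbin}" -c "${bccfg}" > "${log}" 2>&1

# 3. Analyze the logfile after all the bconsole work is done
if [ ! -r "${log}" ]; then
  echo "logfile missing or unreadable"
  exit 1    # admin job will be marked as failed
fi

# keywords are examples; match whatever indicates a problem in your output
if grep -Eqi 'error|failed' "${log}"; then
  echo "problems found in ${log}"
  exit 1    # admin job will be marked as failed
fi

exit 0      # admin job reports OK
# end of sketch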

My part sweeper script is attached. I hope you find it useful as an example
of what you can do with bconsole through scripts.

Regards,
Robert Gerber
402-237-8692
r...@craeon.net

Attachment: cloud-volume-part-sweeper.sh
Description: application/shellscript
