Scott,

Here I am being difficult again, but my dir, fd, sd, and MySQL are all 
on the same box. This is not a separate-host type of job. I run that job 
once a day, and as far as I can tell, it works fine.

The difference is that there are a few different ways of running a script:

RunScript
RunBeforeJob
ClientRunBeforeJob
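
For what it's worth, here is how I understand ClientRunBeforeJob would be
wired up on the director. This is just a sketch -- the resource names,
FileSet, and script path are all made up, and I haven't tested this exact
fragment:

```
# Hypothetical Job resource on the director (server1).
# ClientRunBeforeJob runs the command on the *client's* FD (server2),
# and the FD reports the script's exit status back to the director;
# a nonzero status fails the job before any files are backed up.
Job {
  Name = "server2-mysql"
  Client = server2-fd                 # FD running on server2
  FileSet = "server2-with-dumps"      # includes the mysqldump output dir
  ClientRunBeforeJob = "/usr/local/bin/mysql_dump_wrapper.sh run"
}
```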

The first and last are capable of kicking off a script on a separate 
host. Assume I have server1 as the Bacula dir, sd, fd, and MySQL server. 
I then want to kick off a job that starts a script to mysqldump databases 
on another server, server2, which has its own MySQL server and databases, 
and then return the exit code to the director so that it can start the 
real backup of the server2 client, which includes the output of the 
server2 script. That return part is what is sticking me. I got the 
impression that I might be able to put some kind of wrapper around the 
command on server2 that would do just that, but I haven't been able to 
make a go of it.
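
In case it helps, here is the kind of wrapper I had in mind for server2.
It is only a sketch, untested with Bacula, and the paths and dump options
are made up; the one thing it must get right is passing mysqldump's exit
status through unchanged, since that status is what the FD hands back to
the director:

```shell
#!/bin/sh
# Hypothetical ClientRunBeforeJob wrapper for server2.
# Bacula treats a nonzero exit status as a failure, so the wrapper
# must not swallow mysqldump's exit status.

DUMPDIR="${DUMPDIR:-/tmp/mysql-dumps}"   # assumed dump location
DUMPFILE="$DUMPDIR/all-databases.sql"
mkdir -p "$DUMPDIR"

run_dump() {
    # Run the given dump command, capture its output to DUMPFILE,
    # and return its exit status unchanged so the director sees it.
    "$@" > "$DUMPFILE"
    status=$?
    if [ "$status" -ne 0 ]; then
        echo "dump failed with status $status" >&2
    fi
    return "$status"
}

# The real dump only runs when the script is invoked as "wrapper.sh run",
# e.g. from ClientRunBeforeJob; options here are illustrative only.
case "${1:-}" in
    run)
        run_dump mysqldump --all-databases
        exit $?
        ;;
esac
```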

I am running MySQL 3.23, so there may be some limitations on the 
utilities I have available for MySQL. I believe newer versions add 
utilities that will dump across networks, etc.

Thanks for the help.

Steve

Scott Ruckh wrote:
>> Thanks Scott,
>>
>> Can I assume the script does all of the waiting required before starting
>> the data backup? How would this differ from just setting up a shell
>> script that has a mysqldump statement in it? I don't read Perl very well
>> and didn't see anything that does that (at least based on my knowledge
>> of Perl) while just skimming the file.
>>
>> Sorry to be demanding, but this is the crux of the problem I had
>> before. Any type of shell script didn't return a code before timing out.
>> The manual addresses this, but in a couple of different ways. At present,
>> by doing the mysqldump from cron locally on the client, and using
>> logrotate to keep the directory clean, it works pretty well, but I'd
>> still like to get it working with the benefit of ClientRunBeforeJob.
>>
>>     
>
> Yes, I use this script with Bacula as mentioned before, and it works great
> for me.  I keep one week of bzip2 files which contain a dump of each of my
> MySQL databases.  Each day I receive an e-mail which contains the output
> from this script (databases that were backed up, compressed file names,
> etc.).  Again, this works great for me, but it may not be right for everyone.
>
> Actually, the make_catalog_backup script that comes with Bacula, which is
> used in the BackupCatalog Job {} definition, does basically exactly what
> you are trying to accomplish. It uses mysqldump to dump the bacula
> database to a file.  That file is backed up, and then the file is
> deleted after the backup is completed.  Perhaps you just want to take a
> look at the make_catalog_backup script and the BackupCatalog job for an
> example?
>
> I am sure there are many different processes people have implemented to
> accomplish this task; I was just mentioning the one I chose, which works
> well for me.
>
> Good Luck.
>
>
>
>   


_______________________________________________
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users