Hello again,
In case anyone else ever needs it, this line will create a full SQL dump
of all databases from the local MySQL server:
RunBeforeJob="cmd /k mysqldump -uUSER
-pPASS --all-databases -r C://DIR//FILE.sql"
NB: you must create an empty text file first, e.g. FILE.sql
Thanks for the help guys!
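For anyone wiring this up from scratch, here is a rough, untested sketch of how that directive can sit in the Director config, with the dump file also listed in the FileSet so it actually gets backed up (the resource names and paths below are only placeholders, not my real config):

Job {
  Name = "CatalogBackup"
  FileSet = "CatalogDump"
  ...
  # runs on the Director host before the backup starts
  RunBeforeJob = "cmd /k mysqldump -uUSER -pPASS --all-databases -r C://DIR//FILE.sql"
  ...
}

FileSet {
  Name = "CatalogDump"
  Include {
    Options {
      signature = MD5
    }
    # forward slashes seem to be the safest way to spell Windows paths here
    File = "C:/DIR/FILE.sql"
  }
}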
Alan Brown wrote:
mysqlhotcopy can be used to copy out the tables in binary format if that's
really wanted, but my experience is that the mysqldump output compresses
_extremely_ well on tape.
Thanks Alan, I will look into that also. Though the db is not huge, and m
Alan Brown wrote:
On Tue, 5 Feb 2008, John Drescher wrote:
>> @echo off
>> xcopy /y /q /k C:\MySQL\data\bacula E:\backup\bacula-database\
>> exit
>>
> Isn't there a mysqldump for windows? I do not think this is a good way
> of backing up a database that is in use.
It isn't.
mysqlhotcopy can be used to copy out the tables in binary format if that's
really wanted, but my experience is that the mysqldump output compresses
_extremely_ well on tape.
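For anyone who wants to try that route, a mysqlhotcopy run looks roughly like this (user, password and target directory are placeholders, and as far as I know it only handles MyISAM/ARCHIVE tables):

mysqlhotcopy --user=USER --password=PASS bacula /path/to/backup/dir

As I understand it, it flushes and read-locks the tables, copies the raw table files, then releases the lock, so the copy stays consistent even while the server is running.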
I did some tests a few days ago; try this form:
Job
{
...
ClientRunBeforeJob = "cmd /k c:\\programmi\\bacula\\scripts\\before.bat"
...
}
Important things:
- "cmd /k"
- doubled slashes "\\"
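For example, before.bat can be as simple as a thin wrapper around mysqldump; an untested sketch (USER, PASS and the output path are placeholders):

@echo off
rem dump the bacula catalog before the job runs
mysqldump -uUSER -pPASS -r c:\backup\bacula.sql bacula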
Cesare Montresor
[EMAIL PROTECTED] wrote:
I am pretty sure the xcopy script IS failing. I just can't figure out why.
If I use my DEL script in place of the xcopy script, it works like a
charm. Yet, if I run the xcopy batchfile manually it also completes
flawlessly. I would assume this is a permissions thing, but bacula was
installed
> I apologise for the confusion.
>
Now I think I confused you.
>
> Thanks for the help though, I would have been testing everything twice
> otherwise! lol
>
Are you sure that the xcopy did not fail? I think the exit command
returns 0 regardless of the success or failure of the xcopy. Or am I
wrong?
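One way to rule that out is to hand xcopy's result back explicitly instead of relying on what a bare exit happens to return; an untested sketch of the batch file:

@echo off
xcopy /y /q /k C:\MySQL\data\bacula E:\backup\bacula-database\
rem pass xcopy's exit code back to Bacula instead of a bare exit
exit %ERRORLEVEL%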
I apologise for the confusion.
Thanks for the help though, I would have been testing everything twice
otherwise! lol
Dan
John Drescher wrote:
> However, I have been testing with simple files, so I didn't put the db at
> risk. It seems that the targeted script is simply not run at all, the result
> would be the same with any script wouldn't it?
>
I was under the assumption that the xcopy command failed because the
files that it was tryi
Thanks :)
If the director is the machine with the database then you can use
RunBeforeJob otherwise you have to use ClientRunBeforeJob.
Awesome! I could not find that anywhere :)
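For the archives, that means one of these two lines in the Job resource (the script path is just a placeholder):

# database on the Director host:
RunBeforeJob = "cmd /k c:\\scripts\\dump.bat"
# database on the client being backed up:
ClientRunBeforeJob = "cmd /k c:\\scripts\\dump.bat"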
Isn't there a mysqldump for Windows? I do not think this is a good way
of backing up a database that is in use.
On Feb 5, 2008 8:21 PM, [EMAIL PROTECTED]
<[EMAIL PROTECTED]> wrote:
>
> Hi Guys
>
> I am officially stumped.
>
> I am trying to run a script before the job to copy the bacula db before
> backing it up. The batch script works when it's clicked. But, whether I use
> 'ClientRunBeforeJob' or 'RunBeforeJob', the files are not copied, despite a
> successful report from the run.
Hi Guys
I am officially stumped.
I am trying to run a script before the job to copy the bacula db before
backing it up. The batch script works when it's clicked. But, whether I
use 'ClientRunBeforeJob' or 'RunBeforeJob', the files are not
copied, despite a successful report from the run: