Marc wrote:
> I have to maintain a server park with 500+ databases, with databases being
> removed and added every day. So defining a job for each database is not an
> option.
>
> Creating a backup of the database files is not what I'm looking for, as I
> would be unable to restore InnoDB databases. InnoDB databases are stored in
> one file, not in separate files/directories like MyISAM.
>   
First off, you shouldn't need to define more than one job per machine. 
Your run-before-job script(s) should be perfectly able to determine 
which databases to dump at run time.
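
For example, the script does not have to know the database names in 
advance. Here is a minimal sketch of the discovery step, assuming 
PostgreSQL and a "postgres" user that can connect locally without a 
password (with MySQL, SHOW DATABASES plays the same role):

  #!/bin/sh
  # Sketch only: list every non-template database at run time, then act
  # on each one. (Assumes database names contain no whitespace.)
  for db in $(psql -U postgres -At -c \
          "SELECT datname FROM pg_database WHERE NOT datistemplate"); do
      echo "would dump: $db"   # replace with the real pg_dump/mysqldump call
  done

Briefly, here is how my experience has gone with PostgreSQL databases: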

1. Higher-ups have said no downtime is allowable, so I cannot simply 
shut down the database server, back up the raw working files, and 
restart it. I also cannot back up those raw files while the server is 
running, because it changes them constantly and the copy would never be 
consistent.

2. There is a bit of a bug in the run-before-job handling: if your 
script takes more than 30 minutes, the rest of the job (the actual data 
transfer and any run-after-job step) does not run and the job is marked 
as an error. This has to do with certain connection timeouts between 
the FD and the SD. You could make a source change and recompile Bacula 
to raise this limit, but I didn't want to do that.

3. My set-up involves a few dozen machines, each with anywhere from 50 
to 300 databases.

4. First I tried using FIFOs, but ran into various issues. I would not 
recommend them unless you are critically short on disk space on your 
servers, since they bring their own complications. (If you do decide to 
try FIFOs, let me know; I have some scripts from that time which may be 
helpful. The general shape is sketched just after this list.)
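
The FIFO approach, from memory and untested as written (the path below 
is made up, and the Bacula FileSet would need to include the pipe with 
readfifo=yes so the FD reads from it during the backup):

  #!/bin/sh
  # Rough sketch of a FIFO-based run-before-job script.
  FIFO=/var/backups/db/mydb.fifo     # hypothetical path, listed in the FileSet
  [ -p "$FIFO" ] || mkfifo "$FIFO" || exit 1

  # Start the dump in the background; it blocks until the FD opens the
  # pipe for reading while the backup job runs.
  pg_dump -U postgres mydb > "$FIFO" &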


Here's what I do currently. I have a "DB" jobdef and a "DBPREP" jobdef. 
The DBPREP jobs use a client-run-after-job script (to avoid that 
30-minute timeout issue) to do the actual dumping of the databases into 
files on disk. The DB jobs have a higher Priority= number, so they run 
after all DBPREP jobs are done. Each DBPREP script rewrites a little 
file called "rval.dat" when it finishes, and that file records whether 
the DBPREP job actually ran OK or not. The DB client-run-before-job 
scripts check that file before doing the real backup and cancel the DB 
job if the previous DBPREP didn't work out. A stripped-down sketch of 
both halves follows.
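
In this sketch the file name rval.dat is real, but the paths and the 
dump_all_databases helper are placeholders for illustration:

  #!/bin/sh
  # DBPREP client-run-after-job (sketch): do the dumps, record the result.
  # "dump_all_databases" stands in for whatever actually performs the dumps.
  RVAL=/var/backups/db/rval.dat    # hypothetical location of the status file
  if dump_all_databases; then
      echo "OK $(date +%Y%m%d%H%M%S)" > "$RVAL"
  else
      echo "FAIL $(date +%Y%m%d%H%M%S)" > "$RVAL"
  fi

  #!/bin/sh
  # DB client-run-before-job (sketch): refuse to run if DBPREP failed.
  # A non-zero exit status here should cause Bacula to cancel the DB job.
  RVAL=/var/backups/db/rval.dat
  grep -q '^OK' "$RVAL" || exit 1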

On "full" backups, the scripts I have on each client dump and compress 
each database into dbname.sql.gz, and then delete the older files. On 
"differential" or "incremental" days, the script dumps each database 
into dbname.YYYYMMDD.gz. Then, it computes a differential between that 
and the original dbname.sql.gz file. The differential is 
dbname.YYYYMMDD.diff Finally, it dbname.YYYMMDD.gz and keeps the 
differential. I use "xdelta3" for generating diffs since the default 
"diff" tool tends to break very badly when handling large files--it uses 
too much memory.
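
(The paths here are made up, and $LEVEL stands for the backup level, 
which can be passed in from Bacula, e.g. via the %l substitution.)

  #!/bin/sh
  # Sketch of the per-database full/differential dump logic.
  LEVEL=$1                       # "Full", "Differential" or "Incremental"
  DUMPDIR=/var/backups/db        # hypothetical dump directory backed up by Bacula
  TODAY=$(date +%Y%m%d)
  cd "$DUMPDIR" || exit 1

  for db in $(psql -U postgres -At -c \
          "SELECT datname FROM pg_database WHERE NOT datistemplate"); do
      if [ "$LEVEL" = "Full" ]; then
          pg_dump -U postgres "$db" | gzip > "$db.sql.gz" || exit 1
          rm -f "$db".*.diff         # drop deltas made against the previous full
      else
          pg_dump -U postgres "$db" | gzip > "$db.$TODAY.gz" || exit 1
          # Binary delta against the full dump; xdelta3 handles large files
          # far better than plain "diff".
          xdelta3 -e -s "$db.sql.gz" "$db.$TODAY.gz" "$db.$TODAY.diff" || exit 1
          rm -f "$db.$TODAY.gz"      # keep only the delta
      fi
  done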

--Darien Hager

-- 
Darien Hager
[EMAIL PROTECTED]

