On Fri, 14 Oct 2005, Russell Howe wrote:
nathan r. hruby wrote:
Hi,
I have a small HA cluster using shared storage. The shared storage
mounting is handled by the cluster software and follows the services that
require the shared storage points. Obviously I'd like to back up this
shared storage.
Since these disks move around, I'd like to not just point bacula at each
box and back up everything including the shared storage, as then restores
become a problem (the same files end up spread across multiple hosts' jobs..)
My solution, I think, is to simply run a separate director configured for
the hostname of the HA service it is running and backup just the HA data.
I've been meaning to do this too..
It should be quite easy for me.. our cluster is a failover one, with
shared storage that's only mounted on one machine at any one time. It
has an IP address which gets failed over with the storage, so it should
just be a case of splitting the jobs (which is actually already done)
and pointing them at the clustered address instead of the machines'
'own' IP addresses.
If that makes sense...
I'd split your jobs into three:

Job {
  FileSet = "Server"
  Client = "server-a-fd"
  ...
}

Job {
  FileSet = "Server"
  Client = "server-b-fd"
  ...
}

Job {
  FileSet = "SharedStorage"
  Client = "cluster-fd"
  ...
}
Yes, but this requires a separate bacula-fd for each HA service address.
For example, imagine a setup like this:
boxA -> 1.2.3.1
boxB -> 1.2.3.2
serviceA -> 1.2.3.3
serviceB -> 1.2.3.4
serviceA and serviceB can float between boxA and boxB, so both of those
machines will need to run 3 bacula-fd instances: one for the box, and one
for each service. You *could* set this up to only start the additional
bacula-fd instances when the node takes over the role, but I'm lazy and the HA
stuff we're using makes that kinda hard.
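A minimal sketch of what one of those extra instances might look like (the
paths, password, and resource names are assumptions, not taken from a real
setup): each service gets its own bacula-fd config bound to the floating
address, so the director always reaches whichever node currently holds the
service.

```conf
# Hypothetical /etc/bacula/bacula-fd-serviceA.conf (all names, paths and
# the password here are assumptions):
FileDaemon {
  Name = cluster-fd           # matches Client = "cluster-fd" in the Job above
  FDAddress = 1.2.3.3         # bind only to serviceA's floating IP, so the
  FDPort = 9102               # default port doesn't clash with the node's own fd
  WorkingDirectory = /var/bacula/serviceA
  Pid Directory = /var/run
}
Director {
  Name = bacula-dir           # must match the director's configured name
  Password = "changeme"
}
Messages {
  Name = Standard
  director = bacula-dir = all, !skipped
}
```

Each instance would then be started with something like
"bacula-fd -c /etc/bacula/bacula-fd-serviceA.conf", either by hand or from
the HA resource scripts on failover.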
Doing a restore shouldn't be tricky really.. you can restore any job to
any client (well, unless you want to restore Windows files to a
non-Windows box and you didn't use the "Portable" directive)
Right, if you do it the way above, so the service is its own client. If
you simply back up boxA and boxB with whatever data they happen to have
mounted, it's a PITA trying to figure out which files were mounted
where when you need to restore.
I've set up things just like that, and other than the fact that it's a
pain because we stunnel bacula, it does work nicely.
Now if only I could confirm whether bacula does the read for a
File = "\\</foo/bar.list" directive in a FileSet before executing
Client Run Before Job (which is not what I want, but what I think
happens, but I'm not sure :)
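For reference, the construct in question looks roughly like this (the
list-generating script name is hypothetical); if bacula really reads the
list at job start, before Client Run Before Job fires, a freshly
generated list would only take effect on the *next* run:

```conf
FileSet {
  Name = "SharedStorage"
  Include {
    Options { signature = MD5 }
    File = "\\</foo/bar.list"   # read the actual file list from /foo/bar.list
  }
}

Job {
  FileSet = "SharedStorage"
  Client = "cluster-fd"
  # hypothetical script that (re)writes /foo/bar.list with whatever
  # shared storage is currently mounted:
  Client Run Before Job = "/usr/local/sbin/make-bar-list"
  ...
}
```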
-n
--
-------------------------------------------
nathan hruby <[EMAIL PROTECTED]>
uga enterprise information technology services
production systems support
-------------------------------------------
_______________________________________________
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users