> Heitor Faria <hei...@bacula.com.br> wrote on 18.7.2018 at 20.42:
> 
> Dear Users,
> 
> I'm planning to deploy a Copy Job for Geographical Redundancy Disaster 
> Recovery (Site A, Site B).
> Failover site (B) has a secondary Bacula Director, Catalog and Storage Daemon.
> Do you think it is possible to perform Copy jobs from Site A to Site B, using 
> the failover Catalog as the metadata repository?
> 

Hello

As far as I know, a Copy job cannot run across two Storage Daemons. That is 
what I learned when I needed this myself.

I implemented a workaround using two pools whose volumes live in different 
folders, with the "remote" folder being an NFS mount of the remote site. I 
then added an After Job script that creates symlinks in each folder pointing 
to the media files in the other folder.
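
Roughly, the layout was along these lines. This is only a sketch; all 
resource names, paths and the script name below are placeholders, not my 
real configuration:

  # bacula-dir.conf (sketch)
  Pool {
    Name = Pool-Local
    Pool Type = Backup
    Storage = File-Local      # SD device writing to /bacula/local
    Next Pool = Pool-Remote   # Copy jobs put the second copy here
  }
  Pool {
    Name = Pool-Remote
    Pool Type = Backup
    Storage = File-Remote     # SD device writing to /bacula/remote
                              # (NFS mount of site B)
  }
  Job {
    Name = "Copy-To-SiteB"
    Type = Copy
    Selection Type = PoolUncopiedJobs
    Pool = Pool-Local
    RunScript {
      RunsWhen = After
      RunsOnClient = No
      Command = "/usr/local/bin/link-volumes.sh"
    }
  }

The after-job script just cross-links the volume files so that either 
device directory sees all media (again only a sketch):

  #!/bin/sh
  # link-volumes.sh: symlink each volume file into the other folder,
  # skipping names that already exist there.
  for v in /bacula/local/*; do
    t=/bacula/remote/$(basename "$v")
    [ -e "$t" ] || ln -s "$v" "$t"
  done
  for v in /bacula/remote/*; do
    t=/bacula/local/$(basename "$v")
    [ -e "$t" ] || ln -s "$v" "$t"
  done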

I had only one Director, so using two Directors would need some additional 
workaround. For example, you could run bscan on the remote site to load the 
copied media into the Director's catalog there, as sketched below.
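
Something along these lines on the remote Storage Daemon, pointing bscan at 
the copied volume so its records end up in the remote Director's catalog 
(the volume, device and config names here are placeholders):

  # Scan volume Vol-Remote-0001 on device FileStorage and store the
  # file (-s) and media (-m) records in the catalog for that SD.
  bscan -v -s -m -c /opt/bacula/etc/bacula-sd.conf \
        -V Vol-Remote-0001 FileStorage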

All this is far from ideal for your purpose, but I guess something like this 
is currently needed with Bacula.

Maybe Bacula Systems will develop something this enterprisey in the future; 
only time will tell.

br. jarif

