Hello,

Wed, 12 Oct 2022 at 21:37 Robert M. Candey <can...@acm.org> wrote:

> I've been using Bacula to back up many servers and desktops to a tape
> library since early on, but always had one server running all of the Bacula
> processes except for the individual file servers.
>
I assume you are familiar with the Bacula architecture, in which every
component can run on a separate machine as long as there is a working
network connection between them.
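As a minimal sketch (host names and passwords below are placeholders, not
your real configuration), each remote daemon is simply referenced by its
network address in bacula-dir.conf, e.g. a File daemon running on the big
data server:

  Client {
    Name = data-server-fd
    Address = data-server.example.org   # host running the File daemon
    FDPort = 9102
    Password = "fd-secret"
    Catalog = MyCatalog
  }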

>
> I'm setting up a new tape library and have new data servers, so I'm
> wondering if there is a more efficient architecture for backing up 1PB,
> mostly stored on one server and NFS-mounted to the other data servers.
>
> Does it make sense to run the PostgreSQL database server and storage
> servers on their own servers dedicated to Bacula?
>
It is always a good practice to separate different workloads for
performance reasons.
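For example, moving the Catalog to a dedicated PostgreSQL machine only
requires pointing the Catalog resource at that host (hostname and
credentials below are illustrative):

  Catalog {
    Name = MyCatalog
    dbname = bacula
    dbuser = bacula
    dbpassword = "db-secret"
    DB Address = pgsql-host.example.org   # dedicated PostgreSQL server
    DB Port = 5432
  }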

> Is there value in running the Director on one or the other?
>
In most cases users run all Bacula components on a single server. When
performance scaling is required, the Director and Catalog move to a
separate machine and additional Storage machines are added.
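Adding another Storage machine is then just another Storage resource in the
Director configuration, e.g. for an SD host attached to the new tape
library (all names below are made up for illustration):

  Storage {
    Name = tape-sd
    Address = sd-host.example.org   # machine connected to the tape library
    SDPort = 9103
    Password = "sd-secret"
    Device = "LTO-Autochanger"
    Media Type = LTO
    Autochanger = yes
  }

Jobs that should go to tape then simply reference Storage = tape-sd.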

> Should I continue to run the storage daemon on the server that hosts the
> large data?
>
It is highly recommended _NOT_ to run the Storage daemon on the machine
that hosts the data to back up, unless you back up to tape storage and you
are an expert who knows exactly what they are doing.

> I'm thinking that the NFS server might be more efficient if run on its
> own, and transfer its data over the network (100GbE) to the Bacula storage
> server attached to the tape library.  And perhaps PostgreSQL could have
> dedicated memory and CPU. I don't know what if anything is slowing down our
> backups.  Full backups take 4-6 weeks for 500 TB now.
>
If a 500 TB backup takes about 1.5 months then, depending on your
requirements, you should optimize your setup ASAP. You can optimize in a
dozen different ways. First (as always) you need to find your bottleneck:
the point where the whole backup slows down. It could be the network, CPU,
memory, the database (when not using attribute spooling and you back up a
massive number of files), the data source, etc. Doing optimization without
analyzing the backup flow is a waste of time and effort.
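As a rough sanity check: 500 TB in 4-6 weeks averages only about 140-210
MB/s, far below what 100GbE or a modern LTO drive can sustain, so something
other than the raw hardware is the limiting factor. If the catalog turns
out to be it, attribute spooling (and data spooling when writing to tape)
is enabled per Job, for example (resource names and sizes below are only
illustrative):

  Job {
    Name = "BigDataToTape"
    Type = Backup
    Client = data-server-fd
    FileSet = "BigDataSet"
    Storage = tape-sd
    Pool = TapePool
    Spool Attributes = yes   # batch catalog inserts after the data is written
    Spool Data = yes         # spool to fast local disk, then stream to tape
    Spool Size = 500G        # per-job limit on the SD's spool disk
  }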

best regards
-- 
Radosław Korzeniewski
rados...@korzeniewski.net