I'm not sure running two jobs concurrently will help -- I believe the FD is
still single-threaded, although someone can correct me if I'm wrong.

My solution was to go to virtual full backups, so that full backups on the
client became a rare event. The heavy job then becomes the virtual full
consolidation, which is strictly an SD and Director issue. My chokepoint for
consolidation jobs is currently attribute despooling, which thrashes the
database pretty hard, but it's still a lot faster than a full backup from
the client.
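For reference, here's a rough sketch of the kind of job definition involved.
All the names here are placeholders, and you'll still need a schedule and a
separate pool for the consolidated volumes (Next Pool on the source pool) --
treat this as an outline, not a drop-in config:

```
# Hypothetical Bacula job resource for the virtual full approach.
# Names (BigDir-Job, bigdir-fd, etc.) are made up for illustration.
Job {
  Name = "BigDir-Job"
  Type = Backup
  Level = Incremental          # normal runs are incremental
  Client = bigdir-fd
  FileSet = "BigDir-FileSet"
  Pool = BigDir-Inc-Pool       # its Next Pool should point at the
                               # consolidation pool for VirtualFull
  Accurate = yes               # virtual full consolidation requires
                               # accurate mode
}
```

You then run the consolidation by submitting the same job at Level=VirtualFull
(e.g. from a schedule or "run job=BigDir-Job level=VirtualFull" in bconsole),
which builds the new full from existing volumes without touching the client.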

On Mon, Apr 27, 2020 at 9:34 AM Mark Dixon <mark.c.di...@durham.ac.uk>
wrote:

> Hi all,
>
> Am I right in thinking that a single bacula job can only back up each file
> in its fileset sequentially - there's no multithreading available to back
> up multiple files at the same time in order to leverage the client CPU?
>
> I'm a relatively long-term user of bacula (thanks!) who has been happy
> backing up relatively small data volumes to disk, but am now faced with a
> fairly large directory. "Large" is defined as "takes too long to do a full
> dump" and the limiting factor at the moment might be down to software
> compression on the client's CPU.
>
> Playing with the compression settings is the obvious approach, but I was
> wondering about other options - particularly as I may have a use case for
> client-side encryption as well.
>
> If the job stubbornly takes too long to back up, I suspect I'm looking at
> splitting the directory across multiple jobs and running them
> concurrently.
>
> Is that right?
>
> Thanks,
>
> Mark
>
>
> _______________________________________________
> Bacula-users mailing list
> Bacula-users@lists.sourceforge.net
> https://lists.sourceforge.net/lists/listinfo/bacula-users
>


-- 
David Brodbeck
System Administrator, Department of Mathematics
University of California, Santa Barbara