Hi all,

Am I right in thinking that a single Bacula job can only back up the files in its FileSet sequentially - there's no multithreading available to back up multiple files at the same time and so make use of multiple cores on the client's CPU?

I'm a relatively long-term user of Bacula (thanks!) who has been happily backing up fairly small data volumes to disk, but I'm now faced with a rather large directory. "Large" here means "takes too long to do a full dump", and the limiting factor at the moment may well be software compression on the client's CPU.

Tuning the compression settings is the obvious first step, but I was wondering about other options - particularly as I may also have a use case for client-side encryption.
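
For concreteness, this is roughly the shape of config I've been experimenting with - the resource names, paths, and key files below are invented placeholders, so please correct me if I've misread any of the directives:

# Per-FileSet software compression (bacula-dir.conf).
# GZIP1 is fastest, GZIP9 smallest; LZO is much lighter on the CPU.
FileSet {
  Name = "BigDir"
  Include {
    Options {
      signature = MD5
      compression = GZIP6
    }
    File = /data/bigdir
  }
}

# Client-side (PKI) encryption in the client's bacula-fd.conf - the
# part I expect to add even more CPU load to the same single stream.
FileDaemon {
  Name = myhost-fd
  PKI Signatures = Yes
  PKI Encryption = Yes
  PKI Keypair = "/etc/bacula/myhost-fd.pem"
  PKI Master Key = "/etc/bacula/master.cert"
  # (usual Working Directory / Pid Directory etc. omitted here)
}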

If the job stubbornly remains too slow, I suspect I'm looking at splitting the directory across multiple jobs and running them concurrently.
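
If so, I imagine it would look something like the sketch below - two jobs over disjoint halves of the tree (again, all names and paths are made up for illustration):

# Shared defaults for the split jobs (bacula-dir.conf).
JobDefs {
  Name = "BigDirDefaults"
  Type = Backup
  Level = Full
  Client = myhost-fd
  Storage = File
  Pool = Default
  Messages = Standard
}

Job {
  Name = "BigDir-A"
  JobDefs = "BigDirDefaults"
  FileSet = "BigDir-A"
}

Job {
  Name = "BigDir-B"
  JobDefs = "BigDirDefaults"
  FileSet = "BigDir-B"
}

# Disjoint halves of the directory, one FileSet each.
FileSet {
  Name = "BigDir-A"
  Include {
    Options {
      signature = MD5
      compression = LZO
    }
    File = /data/bigdir/a
  }
}

FileSet {
  Name = "BigDir-B"
  Include {
    Options {
      signature = MD5
      compression = LZO
    }
    File = /data/bigdir/b
  }
}

My understanding is that "Maximum Concurrent Jobs" would also need raising in the Director, Client, and Storage resources (and the disk Device must allow concurrency) for the two jobs to genuinely overlap.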

Is that right?

Thanks,

Mark

