Hi,

I tried to build Bacula with S3 support but failed. This might be due to
the specific libs3 version required [
https://sourceforge.net/p/bacula/mailman/message/36503142/].

As that didn't work, I'm now looking at manually copying File Volumes
over to S3, as briefly described here [
https://blog.bacula.org/whitepapers/ObjectStorage.pdf]. But I wasn't able
to find any details on exactly how to do so (for example, running a job
that deletes all "Full" or "Used" volumes).

I know what the SQL query would look like and could write a bash script
that does the uploads to S3 and deletes the volumes. But I would have
thought there are other people doing similar things who already have
proven approaches for this.
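To make the idea concrete, here is roughly what I have in mind. This is only a sketch: the catalog name "bacula", the volume directory, and the bucket name are placeholders, and Media.VolStatus is the catalog column I'd query.

```shell
#!/bin/bash
# Sketch only: upload Full/Used volumes to S3 and remove the local copy
# only when the upload succeeded. Paths and bucket name are placeholders.
VOLDIR="${VOLDIR:-/opt/bacula/volumes}"
BUCKET="${BUCKET:-s3://my-backup-bucket}"

# Print the volumes the catalog marks as Full or Used.
list_s3_candidates() {
  mysql -N -B bacula -e \
    "SELECT VolumeName FROM Media WHERE VolStatus IN ('Full','Used');"
}

# Copy each named volume to S3; delete the local file only on success.
upload_and_prune() {
  local vol
  for vol in "$@"; do
    if aws s3 cp "$VOLDIR/$vol" "$BUCKET/$vol"; then
      rm -f -- "$VOLDIR/$vol"
    fi
  done
}

# Intended use (not run here):
# upload_and_prune $(list_s3_candidates)
```

The upload-before-delete ordering is the important part for me: a volume should only be removed locally after the copy to S3 has returned success.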

My plan is to back up multiple machines onto a single storage daemon with
local file storage (this part is already working).

For security reasons I'd like to copy all the Volume files to an S3
bucket immediately after each backup.
(I think that would just be a RunScript definition or the like, with a
bash script doing an s3 sync.)
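For the RunScript part, I was thinking of something along these lines in the Job resource (the script path is a placeholder; RunsWhen, RunsOnClient, and Command are the standard directives):

```
Job {
  Name = "BackupClient1"
  ...
  RunScript {
    RunsWhen = After      # run after the backup has finished
    RunsOnClient = No     # run on the Director host, where the SD volumes live
    Command = "/usr/local/bin/volume-s3-sync.sh"   # placeholder path
  }
}
```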

To keep the required amount of disk space low, I'd like to delete local
volumes according to a specific scheme (like keeping a specific time range,
or keeping the last full + incrementals), and only if the sync succeeded.
That's the tricky part, as I don't know how to select the volumes from a
schedule or a script without accessing the database directly.
And since we are talking about backups, I don't want to experiment too much.
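One approach I'm considering, to avoid touching the database directly, is to go through bconsole instead. Something like the following; note that the column positions in the "list volumes" output are an assumption on my part and would need checking against the actual output of my Bacula version:

```shell
# Sketch: ask the Director for volume status via bconsole instead of SQL.
# Prints the names of volumes currently in Full or Used status.
list_candidates() {
  echo "list volumes" | bconsole | awk -F'|' '
    /\|/ {
      gsub(/ /, "", $3); gsub(/ /, "", $4)  # assumed: $3 = VolumeName, $4 = VolStatus
      if ($4 == "Full" || $4 == "Used") print $3
    }'
}
```

A cleanup script could then delete only those local files whose S3 copy already exists.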

Am I missing something? I wasn't able to find a detailed description of how
to achieve this.

Ah, I forgot to mention some details:

Bacula version: 9.4.4 (built from source)
OS: Amazon Linux 2

Kind regards,

Erik
_______________________________________________
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users
