Hello All,
> I am using PoolUncopiedJobs with a RunAfterJob which executes a script
> when the job successfully finishes. The script inserts the job ID of
> a copied job into a table created specifically for that purpose.
> The original job ID can be passed to a shell script using "%I".
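A minimal sketch of what the setup described above might look like in bacula-dir.conf (the job name, pool name, and script path are hypothetical examples; Bacula's "Run After Job" directive only fires on successful termination, and %I is the substitution mentioned above):

```
Job {
  Name = "CopyToOffsite"            # hypothetical name
  Type = Copy
  Selection Type = PoolUncopiedJobs
  Pool = "FullPool"                 # hypothetical source pool
  # Record the copied job's ID in the custom tracking table once the
  # copy finishes successfully. Script path is an example.
  Run After Job = "/usr/local/bin/record_copied_job.sh %I"
}
```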
On 2023-01-17 17:45, Bill Arlofski via Bacula-users wrote:
On 1/17/23 06:05, Ivan Villalba via Bacula-users wrote:
How can I run two different copy jobs that copy the same jobid with
the PoolUncopiedJobs ?
You can't.
The PoolUncopiedJobs does exactly what its name suggests: It copies jobs
in a pool that have not been copied to some other pool.
Thanks Bill,
I'm going to try something proposed in this paper:
https://bacula.org/whitepapers/ObjectStorage.pdf
So the idea is to upload the volume data in /bacula-storage/ to AWS S3
using the AWS CLI. I'm doing this as a workaround for the S3 + Object Lock
issue, and it solves the problem of the two copy jobs.
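The whitepaper approach above amounts to syncing Bacula's file-based volumes to an S3 bucket out-of-band. A hedged sketch of the command (bucket name, profile, and storage class are assumptions; `aws s3 sync` uploads only new or changed files):

```shell
# Sync the file-based volume directory to S3.
# Bucket name and storage class are hypothetical examples.
aws s3 sync /bacula-storage/ s3://my-bacula-offsite/bacula-storage/ \
    --storage-class STANDARD_IA
```

Note that volumes synced this way are outside the Catalog, so restoring from them means copying the files back into place first.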
I have confirmed this behaviour:
With the PoolUncopiedJobs selection type for Copy jobs, the second copy
job returns no JobIds to copy.
1) The backup job (Backup on the main backup server's Bacula SD) does a backup.
2) The first copy job (Copy to the 2nd backup server's Bacula SD) does the copy.
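One way around this limitation would be to drive the second copy job from the custom tracking table described earlier, using Bacula's SQLQuery selection type instead of PoolUncopiedJobs. A sketch, assuming a hypothetical copied_jobs table maintained by the RunAfterJob script (table and column names are assumptions):

```
Job {
  Name = "SecondCopyToS3"           # hypothetical name
  Type = Copy
  Selection Type = SQLQuery
  # Return JobIds recorded by the first copy's RunAfterJob script that
  # have not yet been copied a second time.
  Selection Pattern = "SELECT jobid FROM copied_jobs WHERE second_copy IS NULL"
  Pool = "FullPool"                 # hypothetical source pool
}
```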
On 11/29/22 08:19, Ivan Villalba via Bacula-users wrote:
Hi there,
In order to follow the 3-2-1 backup strategy, I need to create a second
copy type job to send backups to S3. The current client definition has two
jobs: one for the main backup (Bacula server), and a copy type job that,
using the Next Pool directive in the original Pool, sends the backups.