I am, but not in a way that makes much difference to you.

I have a script that moves backup disk volumes in a particular directory
that are older than a certain number of days to Glacier; the volumes in
that directory get populated by migration jobs from the primary pool.
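For what it's worth, the age-based selection in that script is nothing fancy. A minimal sketch of the idea (the directory name, threshold, and the upload hand-off are illustrative, not my actual code):

```python
import os
import time

def volumes_older_than(directory, max_age_days):
    """Return paths of volume files whose mtime is older than max_age_days."""
    cutoff = time.time() - max_age_days * 86400
    old = []
    for name in os.listdir(directory):
        path = os.path.join(directory, name)
        if os.path.isfile(path) and os.path.getmtime(path) < cutoff:
            old.append(path)
    return sorted(old)

# The real script hands each selected path to a Glacier upload tool and
# removes the local copy once the upload is confirmed; that part is
# specific to whichever Glacier client you use, so it's omitted here.
```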

I'm skeptical about doing much more than that, for a couple of reasons.
First, I've found that I'm more comfortable dealing with Bacula's
functionality at arm's length. I haven't had much luck, personally, with
things like vchanger, and I'm wary of automating in a fashion that is too
tied up in how things are currently implemented.

Second, I looked at my own use cases for recovery scenarios, and felt that
locally-cached backup sets were a better fit for me, given the constraints
of bandwidth and costs. I did some calculations on the cost of recovering
significant data from Glacier, and depending on your time-to-restore
requirements it can be _really_ expensive.  I'm in the process of adding
code to my scripts to spread recovery request submissions out over a
longer window, but if there is a need for many GB of data from Glacier
_now_ it might cost several hundred dollars.
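To make that concrete: as I understand the retrieval pricing in effect as I write this, you're billed on your _peak hourly_ retrieval rate, roughly peak GB/hr x $0.01 x 720 hours (ignoring the free tier of 5% of stored data per month) - do check current pricing before relying on this. The arithmetic, with made-up numbers:

```python
def glacier_retrieval_cost(total_gb, hours_spread,
                           rate_per_gb_hr=0.01, billing_hours=720):
    """Rough Glacier retrieval cost: peak hourly rate * price * hours/month.

    Simplified: assumes a constant retrieval rate and ignores the
    free-tier allowance, so it overestimates small restores.
    """
    peak_gb_per_hour = total_gb / hours_spread
    return peak_gb_per_hour * rate_per_gb_hr * billing_hours

# 200 GB pulled back in a 4-hour panic vs. spread over a week:
# glacier_retrieval_cost(200, 4)       -> 360.0 (several hundred dollars)
# glacier_retrieval_cost(200, 7 * 24)  -> ~8.57
```

This is exactly why my scripts are growing the ability to spread submissions out: the total data is the same, but the peak rate, and hence the bill, drops dramatically.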

I really think of Glacier as "cold" storage - secondary, in my case to
locally stored volumes. It's great for that, but again, you really want to
work out the costs before committing to using it with any expectations for
actual restore processing.

N.B. One thing I have not looked at is using one of the available S3
mountable filesystem implementations as primary storage and setting up a
process to move data from S3 to Glacier using the AWS lifecycle tools. If
I were starting from scratch today, I'd look there first.
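The appeal is that the "move to Glacier" step becomes a one-time piece of bucket configuration rather than a script you maintain. A lifecycle rule looks something like this (the prefix and transition age here are made-up values for illustration):

```json
{
  "Rules": [
    {
      "ID": "archive-bacula-volumes",
      "Prefix": "bacula/volumes/",
      "Status": "Enabled",
      "Transition": {
        "Days": 30,
        "StorageClass": "GLACIER"
      }
    }
  ]
}
```

Objects under the prefix transition to Glacier automatically once they age past the threshold, with no cron job on your end.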

Hope this helps; feel free to reply directly to me if there is anything I
can amplify.

_KMP

On Thursday, May 2, 2013, Ken Mandelberg wrote:

> The last time I asked about this there was skepticism about using Amazon
> Glacier for cloud storage of backups generated by Bacula.
>
> I still have some interest in this particularly for off site backups for
> disaster recovery at least for occasional snapshots.
>
> Has anyone actually had success using Glacier with Bacula?
>
> I have done some minimal tests using
>
>
> https://github.com/uskudnik/amazon-glacier-cmd-interface/blob/master/doc/Scripting.rst
>
> It's not very usable for daily backups as is. The backup volumes get
> appended to until they reach their maximum size, so a volume would be
> sent repeatedly until it fills and a new volume is created. If I got
> around this by somehow forcing a new volume each time, it would result
> in small files being sent to Glacier, which is not desirable.
>
> I suppose I could decouple the Glacier send from the Bacula director and
> just send finished volumes when a cron sees that a new one has been
> created.
>
> At any rate, just wondering if anyone is using Bacula with Glacier and how.
>
>
>
> ------------------------------------------------------------------------------
> Get 100% visibility into Java/.NET code with AppDynamics Lite
> It's a free troubleshooting tool designed for production
> Get down to code-level detail for bottlenecks, with <2% overhead.
> Download for free and get started troubleshooting in minutes.
> http://p.sf.net/sfu/appdyn_d2d_ap2
> _______________________________________________
> Bacula-users mailing list
> Bacula-users@lists.sourceforge.net
> https://lists.sourceforge.net/lists/listinfo/bacula-users
>