Kern Sibbald wrote (2007/01/29):
> Bacula will do a linear search through the exclude list, so it could be
> extremely CPU intensive. For a large list (more than 1000 files) I believe
> the list needs to be put into a hash tree, which is code that does not
> exist.
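As an aside on why a hash-backed structure helps here, a minimal Python sketch (the file names are made up, and this is an illustration of the data-structure argument, not Bacula code): a plain list scan costs O(n) per path checked, while a hash table gives O(1) average-case membership tests.

```python
# Sketch: why a hash-backed exclude list beats a linear scan.
# File names below are illustrative, not from Bacula's source.

exclude_list = [f"/tmp/junk{i}.log" for i in range(10_000)]

def excluded_linear(path, excludes):
    """O(n) per lookup -- the cost of scanning a plain list."""
    for pattern in excludes:
        if path == pattern:
            return True
    return False

# Building a set (a hash table) once makes each lookup O(1) on average.
exclude_set = set(exclude_list)

def excluded_hashed(path, excludes):
    """O(1) average per lookup."""
    return path in excludes

assert excluded_linear("/tmp/junk9999.log", exclude_list)
assert excluded_hashed("/tmp/junk9999.log", exclude_set)
assert not excluded_hashed("/etc/passwd", exclude_set)
```

With a million files backed up against a 10,000-entry exclude list, the difference is ten billion comparisons versus a million hash lookups.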
Hello, I have implem
>> I'm using Bacula to backup a large file, roughly 109MB in size. I'm
>> also attempting to back up this file to an offsite storage daemon.
>>
>> I've tried 6 times now to get the file to backup, but each time, I get
>> this error (showing the most relevant parts):
>>
>> Fatal er
Dan Langille wrote:
On 29 Jan 2007 at 17:11, Brad Peterson wrote:
I'm using Bacula to backup a large file, roughly 109MB in size. I'm
also attempting to back up this file to an offsite storage daemon.
I've tried 6 times now to get the file to backup, but each time, I get
this error (s
Maybe I should be actually posting this to the dev list on account of
the version being fresh out of the oven...
Anyway, I have a weird problem going on. Both the storage daemon and
the sole client are set up with a heartbeat interval (30 seconds),
but the backup always dies five minutes
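For context (not from the original message, so treat it as a sketch): the heartbeat has to be configured on both daemons for keepalives to flow in both directions, roughly like this:

```
# bacula-fd.conf -- illustrative fragment, not the poster's actual config
FileDaemon {
  Name = client1-fd
  Heartbeat Interval = 30
}

# bacula-sd.conf
Storage {
  Name = offsite-sd
  Heartbeat Interval = 30
}
```

A backup that always dies at the same interval (here, five minutes) often points at a NAT or firewall idle timer between the daemons rather than at Bacula itself, so that is worth checking alongside the heartbeat settings.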
Dear All
Could someone please clarify this situation: I have jobs scheduled to
start at 09:15 with a Max Start Delay of 4 hours, so that no job starts
after 13:15. It seems that since I have enabled Reschedule On Error, the Max
Start Delay applies from when the job was rescheduled.
See be
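A sketch of the configuration being described, using the Bacula Job resource directives (names real, values illustrative):

```
Job {
  Name = "nightly-job"            # illustrative name
  Schedule = "Daily-0915"
  Max Start Delay = 4 hours       # intent: cancel if not started by 13:15
  Reschedule On Error = yes
  Reschedule Interval = 1 hour
  Reschedule Times = 3
}
```

The behaviour observed suggests the delay window is measured from each reschedule rather than from the originally scheduled start time.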
On 29 Jan 2007 at 17:11, Brad Peterson wrote:
> I'm using Bacula to backup a large file, roughly 109MB in size. I'm
> also attempting to back up this file to an offsite storage daemon.
>
> I've tried 6 times now to get the file to backup, but each time, I get
> this error (showing the m
Hello,
On 1/30/2007 2:11 AM, Brad Peterson wrote:
> I'm using Bacula to backup a large file, roughly 109MB in size. I'm
> also attempting to back up this file to an offsite storage daemon.
>
> I've tried 6 times now to get the file to backup, but each time, I
> get this error (showing t
I'm using Bacula to back up a large file, roughly 109MB in size. I'm also
attempting to back up this file to an offsite storage daemon.
I've tried 6 times now to get the file backed up, but each time I get this
error (showing the most relevant parts):
Fatal error: job.c:1748 Comm err
> On Mon, Jan 29, 2007 at 11:12:41PM +1100, James Harper wrote:
>
> > Failing that, is there a way that the director can tell me what the
> > label is on the current disk volume? I'm sure I have seen it tell me on
> > a mount before, or maybe I'm thinking of tapes. It doesn't tell me
> > anything
Hello,
On Monday 29 January 2007 21:19, Alan Davis wrote:
> Kern,
>
> Thanks for the fast response. To clarify a bit - the file list that I
> would be using would be individual files, not directories. There would
> be no exclude list as only the files that I need backed up would be
> listed.
Yes
On 29 Jan 2007 at 20:28, Alan Brown wrote:
> On Sat, 27 Jan 2007, Dan Langille wrote:
>
> >
> > As someone else pointed out, it's a shame the standard method for
> > getting more information about an application involves putting the
> > application name second. Otherwise, everyone would be typi
On Sat, 27 Jan 2007, Dan Langille wrote:
>
> As someone else pointed out, it's a shame the standard method for
> getting more information about an application involves putting the
> application name second. Otherwise, everyone would be typing:
>
> bat man
>
>
http://en.wikipedia.org/wiki/Man-
Kern,
Thanks for the fast response. To clarify a bit - the file list that I
would be using would be individual files, not directories. There would
be no exclude list as only the files that I need backed up would be
listed.
I have about 30TB of data files spread over several hundred directories.
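For a list of individual files this large, the FileSet can read its include list from an external file instead of naming every entry inline; a sketch (path illustrative, and the exact quoting/escaping of the leading `<` varies by version, so check the FileSet documentation):

```
FileSet {
  Name = "BigList"
  Include {
    Options { signature = MD5 }
    # A name preceded by "<" tells the director to read
    # one path per line from the named file.
    File = "</var/bacula/include-list.txt"
  }
}
```

This keeps the director configuration small and lets the list be regenerated between runs without editing bacula-dir.conf.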
On 1/29/07, Aaron Knister <[EMAIL PROTECTED]> wrote:
> I have an ADIC i500 tape library with 36 slots and two LTO-3 tape
> drives. There are currently 8 tapes in the loader and they're defined in
> a storage pool (and labeled by bacula). I have two jobs that reference
> this storage pool. However,
Hello,
This is just to let you know that the Bacula GUI project to create a GUI admin
tool, now officially named bat, is well underway. In a sense, bat is now
born and functioning. There remains a tremendous amount of work, but the base
is there -- a Qt graphical interface that connects to
On Monday 29 January 2007 18:17, Alan Davis wrote:
> I understand that one of the projects is to incorporate features that
> will make very large exclude lists feasible, but does anyone have
> experience, good or bad, with very large include lists in a fileset?
>
> I'm looking at the possibilit
> On Fri, 26 Jan 2007 15:00:03 -0500, Dan Langille said:
>
> We're doing a large backup (1.5 million files). We have Spool
> Attributes = Yes. I suspect it is spooling the attributes now and
> has been doing so for a few hours.
>
> Is "status client" expected to work during the spooling o
Hello,
I am putting Bacula through its paces and have run across some slowness
when purging volumes. I am using FreeBSD 6.1, bacula 1.38.11_1, and
postgresql-server-7.4.13_1 on a dual Xeon with 2GB of RAM. I have three
15-spindle RAID5 arrays (5.6 TB each) to manage spooling and database
needs.
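One common tuning step for slow prune/purge operations on a PostgreSQL catalog (a standard suggestion from Bacula catalog-tuning discussions, not a verified fix for this particular setup) is a composite index on the File table, which the prune queries join against heavily:

```sql
-- Speeds up the joins Bacula's prune/purge queries make against File.
-- Index name is illustrative; run ANALYZE afterwards so the planner uses it.
CREATE INDEX file_jpf_idx ON File (JobId, PathId, FilenameId);
ANALYZE File;
```

On a 7.4-era PostgreSQL it is also worth confirming that routine VACUUM ANALYZE is actually running, since bloat on File and JobMedia shows up first in purge times.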
I have an ADIC i500 tape library with 36 slots and two LTO-3 tape
drives. There are currently 8 tapes in the loader and they're defined in
a storage pool (and labeled by bacula). I have two jobs that reference
this storage pool. However, when I run I can only get one of the drives
to work at a
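Not a confirmed diagnosis, but the usual reason only one drive gets used is a concurrency limit: both drives sit behind one Storage resource, and the various Maximum Concurrent Jobs settings default low. A sketch of the director-side places to raise them (values illustrative):

```
# bacula-dir.conf
Director {
  Maximum Concurrent Jobs = 10    # overall cap across all jobs
}

Storage {
  Name = "i500"
  Maximum Concurrent Jobs = 2     # allow one job per drive
}
```

It is also worth checking that both drives are defined as Device resources in the storage daemon and that the autochanger configuration references both of them.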
Hi,
On 26 Jan 2007 at 15:00, Dan Langille wrote:
> We're doing a large backup (1.5 million files). We have Spool
> Attributes = Yes. I suspect it is spooling the attributes now and
> has been doing so for a few hours.
>
> Is "status client" expected to work during the spooling of
> attribute
Hi list,
I am using FreeBSD 6.1, bacula 1.38.11_1, and postgresql-server-7.4.13_1.
I run concurrent jobs and the following errors seem to appear when
attributes are despooling into the catalog. What do these mean and how do
I fix them?
Jan 27 10:43:56 postgres: [58-1] ERROR: table "delcandidate
I understand that one of the projects is to incorporate features that
will make very large exclude lists feasible, but does anyone have
experience, good or bad, with very large include lists in a fileset?
I'm looking at the possibility of building a backup list from a db query
that has the pot
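One way to feed a query-generated list into a FileSet is to let the director run a program at job time and read paths from its stdout; a sketch (the script path is made up, and the exact quoting/escaping of the leading `|` varies by version, so check the FileSet documentation):

```
FileSet {
  Name = "FromQuery"
  Include {
    Options { signature = MD5 }
    # A name preceded by "|" tells the director to execute the
    # program and read one filename per line from its output.
    File = "|/usr/local/bin/gen-filelist.sh"
  }
}
```

The program runs each time the job starts, so the include list tracks the database automatically; the trade-off is that a slow or failing query delays or breaks the backup.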
> Using DVD-RAM in combination with bacula is on my projects list. Is anybody
> actually doing this?
Yes, I am doing this on my home Linux workstation.
I use a Samsung GSA-2164D DVD recorder with a plain (no cartridge) DVD-RAM disk.
I'm in the first stage of "try-and-see-what-happens" and I configu
Erik,
> I would appreciate receiving a copy of your script since I have the same
> problem and haven't done anything yet to solve it :-)
No problem.
bacula_next_volumes is the same as next_volumes.sh from my
previous mail.
prompt% bacula_next_volumes -h
Usage: bacula_next_volumes [options]
Outp
Hi,
I think the RetentionTime (20h) is too long.
After writing 6-8 hours to the media,
and then adding the retention of 20 hours,
the volume is reusable only after 26-28 hours.
Why not use a RetentionTime of 1 minute?
Bacula will recycle the volumes at job start, when no appendable volume
is available.
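The suggestion above, expressed as a Pool resource sketch (directive names real, values illustrative):

```
Pool {
  Name = "Daily"
  Pool Type = Backup
  AutoPrune = yes
  Recycle = yes                  # reuse volumes once their jobs are pruned
  Volume Retention = 1 minute    # volumes become recyclable almost immediately
}
```

The caveat is that a retention this short means only the most recent cycle is reliably restorable, so it only makes sense when the volume rotation itself provides the history you need.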
On Mon, 29 Jan 2007, James Harper wrote:
> Failing that, is there a way that the director can tell me what the
> label is on the current disk volume? I'm sure I have seen it tell me on
> a mount before, or maybe I'm thinking of tapes. It doesn't tell me
> anything useful when I try it now though.
I need to do the same and achieve it by running two extra
jobs: the first checks what volumes are in the tape
drives and sends an email with volume requirements. The
second checks that the requirements have been met and only
sends another email if they're not. These run early and late
in the a
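A rough sketch of the first check in Python. The status-output format parsed here is an assumption for illustration, not verified against any particular Bacula version; in practice you would pipe the output of a `status storage` command from bconsole into something like this and mail the result.

```python
import re

def mounted_volumes(status_text):
    """Extract volume names from storage-status style output.

    Assumes lines shaped like:
        Device "Drive-0" (/dev/nst0) is mounted with Volume "Vol001"
    That line format is an assumption for this sketch.
    """
    return re.findall(r'mounted with(?: a)? Volume\s+"([^"]+)"', status_text)

sample = '''
Device "Drive-0" (/dev/nst0) is mounted with Volume "Daily-0031"
Device "Drive-1" (/dev/nst1) is mounted with Volume "Daily-0032"
'''
print(mounted_volumes(sample))  # ['Daily-0031', 'Daily-0032']
```

Comparing that list against the volumes the next jobs will request gives the "requirements met" test for the second job.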
Peter Selc wrote:
> Hello,
>
> I'm having problems with configuring one backup. It should be an
> every-day full backup, run at 00:05, in which about 100GB of files are
> backed up (roughly 50GB compressed). It takes about 6-8 hours (client
> and storage are 2 different servers on the same network)
On 29 Jan 2007 at 23:12, James Harper wrote:
> I'm sure I've asked this before a while ago, but I can't seem to see it
> in my mailbox.
>
> Does Bacula have any way of allowing a pre-backup check? One of our
> clients is currently using Amanda, and for all the things I don't like
> about it, one
I'm sure I've asked this before a while ago, but I can't seem to see it
in my mailbox.
Does Bacula have any way of allowing a pre-backup check? One of our
clients is currently using Amanda, and for all the things I don't like
about it, one really nifty feature is that it can do a check and see if,
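Bacula can at least approximate a pre-backup check with a run-before script: if the script exits non-zero, the job errors out before any data moves. A sketch (script path illustrative):

```
Job {
  Name = "checked-backup"
  # ... other Job directives omitted ...
  # Runs on the client before the backup; a non-zero exit
  # status causes the job to fail before any files are sent.
  ClientRunBeforeJob = "/usr/local/bin/preflight-check.sh"
}
```

The script can verify mounts, free space, database quiescence, or whatever Amanda's check was covering, and combined with Reschedule On Error the job can retry later.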
On Monday 29 January 2007 09:59, Florian Schürfeld wrote:
> Hi there,
>
> can someone provide me with a more detailed description of Volume
> Status.
>
> I understand "Append" and "Full", but e.g. what's the
> difference between "Used" and "Full"?
This is a good subject for documentatio
On Fri, 26 Jan 2007, cy tune wrote:
> It's about $8 per disk for a Verbatim double sided DVD-RAM disk (9.4
> GB) in a cartridge. Going the tape route would cost far, far more
> wouldn't it?
It's about $30 for a 200GB (native) LTO2 tape.
Tapes are well-tested for longevity (EXCEPT DATs!)
My exp
Hi there,
Can someone provide me with a more detailed description of Volume
Status?
I understand "Append" and "Full", but e.g. what's the
difference between "Used" and "Full"?
regards
--
Florian Schürfeld <[EMAIL PROTECTED]>