On Thursday 15 June 2006 18:52, Alan Brown wrote:
> On Thu, 15 Jun 2006, Kern Sibbald wrote:
> >> Kern, how well does Bacula cope with directories that have 300,000+
> >> files in them? (no, not being humorous)
> >
> > Well, it really should not have much trouble backing them up or restoring
> > them, though the restore may be a bit slower when creating so many files
> > in one directory -- this is really OS and memory size dependent.
>
> It's the restore tree build times that I am worried about; with GFS,
> even running a directory listing stops everything for several minutes.

Uh, exactly what do you mean by running a directory listing?  On the OS, or 
with the "dir" command within the Bacula tree restore routines?  Once the 
tree is built, operations should be relatively fast.

>
> > On the other hand, building the Bacula in-memory directory tree using the
> > "restore" command could be *really* slow, because inserting each new file
> > into the in-memory tree goes something like O(n^2).
>
> Thanks.
>
> I have users with other filesystems that hold upwards of 6 million small
> files, but these don't have large flat directories...

Are you able to do a Bacula tree restore with 6 million files?
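
To make the O(n^2) remark above a bit more concrete, here is a minimal,
purely illustrative sketch in C -- not Bacula's actual tree code -- of a
naive tree build where every insert walks the existing sibling list before
appending.  With n files in a single flat directory that is roughly n^2/2
string comparisons, which is why one 300,000-file directory can hurt far
more than 6 million files spread across many directories.

/* Hypothetical sketch (NOT Bacula's actual code) of why a naive in-memory
 * tree build degrades to roughly O(n^2) on one large flat directory:
 * every insert scans the whole existing sibling list. */
#include <stdio.h>
#include <stdlib.h>
#include <string.h>

struct node {
    char *name;
    struct node *sibling;   /* next entry in the same directory */
    struct node *child;     /* first entry of a subdirectory */
};

/* Insert `name` under `dir`, reusing an existing entry if one matches.
 * The linear scan of siblings is the quadratic part: for n files in one
 * directory this costs about n^2/2 strcmp() calls in total. */
static struct node *insert_child(struct node *dir, const char *name)
{
    struct node **p = &dir->child;
    while (*p) {
        if (strcmp((*p)->name, name) == 0)
            return *p;              /* already present */
        p = &(*p)->sibling;
    }
    struct node *n = calloc(1, sizeof(*n));
    n->name = strdup(name);
    *p = n;
    return n;
}

int main(void)
{
    struct node root = { "/", NULL, NULL };
    char name[32];

    /* one flat directory; pushing the count toward 300,000 makes the
     * run time grow roughly quadratically */
    for (int i = 0; i < 50000; i++) {
        snprintf(name, sizeof name, "file%06d", i);
        insert_child(&root, name);
    }
    printf("inserted 50000 entries under %s\n", root.name);
    return 0;
}

Replacing the sibling list with a hash table or balanced tree keyed on the
file name would bring each insert down to roughly O(1) or O(log n), which is
the usual cure for this kind of flat-directory blow-up.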

-- 
Best regards,

Kern

  (">
  /\
  V_V

