>>>>> On Tue, 04 Jun 2019 18:08:16 -0500, Larry Rosenman said:
> 
> On 06/04/2019 6:05 pm, Chandler wrote:
> > Larry Rosenman wrote on 6/4/19 15:46:
> >> I was just wondering if it would make sense for Bacula to become
> >> smarter about this, and be helpful and not duplicate the files.
> > 
> > Well, Bacula will do what you tell it, so it's up to you to figure out
> > whether there are duplicated entries in your FileSet.
> > 
> 
> I was kind of hoping it would sort the list into the include/exclude
> lists and, especially if OneFS = yes is set, only hit a path ONCE.
> 
> But I guess not.
> 
> Oh well, was worth asking.

OneFS = yes will prevent Bacula from descending into mounted filesystems, so
it will prevent duplicates of those.  There is no detection of duplicates
within a single filesystem, though.
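
For example, a FileSet along these lines (the name and paths here are just
illustrative) would still pick up everything under /home/larry twice, because
both File entries sit on the same filesystem and OneFS only stops Bacula at
mount points:

 FileSet {
   Name = "example-fs"
   Include {
     Options {
       OneFS = yes
     }
     File = /home
     File = /home/larry
   }
 }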

You can check for duplicate files in the catalog using this query (works on
PostgreSQL at least):

-- list (job, path, filename) combinations recorded more than once
select dup.jobid, path.path, filename.name, countof
 from (select jobid, pathid, filenameid, count(*) as countof
        from file
        group by jobid, pathid, filenameid having count(*)>1) as dup
 inner join path on dup.pathid = path.pathid
 inner join filename on dup.filenameid = filename.filenameid
 -- directory entries have an empty filename, so skip those
 where not filename.name = ''
 limit 100;

Remove the limit 100 if you want to see them all.
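
If you only want to look at one job, a variant of the same query with the
jobid pushed into the subquery should do it (1234 is just a placeholder for
the jobid you care about):

select path.path, filename.name, countof
 from (select pathid, filenameid, count(*) as countof
        from file
        where jobid = 1234
        group by pathid, filenameid having count(*)>1) as dup
 inner join path on dup.pathid = path.pathid
 inner join filename on dup.filenameid = filename.filenameid
 where not filename.name = '';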

__Martin

