>>>>> On Wed, 19 Sep 2007 17:52:40 +0200, Cedric Devillers said:
> 
> Martin Simmons wrote:
> >>>>>> On Wed, 19 Sep 2007 11:54:37 +0200, Cousin Marc said:
> >> I think the problem is linked to the fact that dbcheck works more or less
> >> row by row.
> >>
> >> If I understand correctly, the problem is that you have duplicates in the
> >> path table, as the error comes from
> >> SELECT PathId FROM Path WHERE Path='%s' returning more than one row.
> >>
> >> You could try this query; it would probably be much faster:
> >>
> >> delete from path 
> >> where pathid not in (
> >>    select min(pathid) from path 
> >>    where path in 
> >>            (select path from path group by path having count(*) >1) 
> >>    group by path) 
> >> and path in (
> >>    select path from path group by path having count(*) >1);
> >>
> >> I've just done this very quickly and haven't had time to double-check, so
> >> make a backup beforehand if you want to try it... :)
> >> Or at least do it in a transaction so you can roll back if anything goes
> >> wrong.
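
To illustrate the transaction suggestion, here is a rough sketch only (assuming
a PostgreSQL catalog; the DELETE is just the query quoted above, and the final
COMMIT or ROLLBACK is left as a manual decision):

  BEGIN;

  -- Marc's duplicate-removal query, unchanged.
  DELETE FROM path
  WHERE pathid NOT IN (
      SELECT MIN(pathid) FROM path
      WHERE path IN (SELECT path FROM path GROUP BY path HAVING COUNT(*) > 1)
      GROUP BY path)
  AND path IN (SELECT path FROM path GROUP BY path HAVING COUNT(*) > 1);

  -- Sanity check: if the duplicates had distinct pathids, this should now
  -- return no rows.
  SELECT path, COUNT(*) FROM path GROUP BY path HAVING COUNT(*) > 1;

  -- Then type COMMIT; to keep the change, or ROLLBACK; to undo it.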
> > 
> > Deleting from path like that could leave the catalog in a worse state than
> > before, with dangling references in the File table.  The dbcheck routine
> > updates the File table to replace references to deleted pathids.
> > 
> > Moreover, if deleting duplicate pathids is slow (i.e. there are many of
> > them), then the catalog could be badly corrupted, so I don't see how you
> > can be sure that the File records are accurate.  It might be better to wipe
> > the catalog and start again, or at least prune all of the file records
> > before running dbcheck.
> > 
> > __Martin
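
As an aside, one way to look for such dangling references afterwards (assuming
the standard Bacula schema, where each File row carries a PathId) is a query
along these lines; any non-zero count means File rows pointing at pathids that
no longer exist in Path, which is what dbcheck normally repairs:

  SELECT COUNT(*) FROM File f
  WHERE NOT EXISTS (SELECT 1 FROM Path p WHERE p.PathId = f.PathId);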
> 
> I think that the approach Marc suggested may not be that bad in my case.
> 
> Taking a closer look at the duplicate paths shows that they are not
> conflicting PathIds, but two rows with the same PathId.
> 
> Here is an example:
> 
> restorebacula=# SELECT PathId FROM Path WHERE
> Path='/home/tbeverdam/Maildir/';
>  pathid
> --------
>   12251
>   12251
> (2 rows)
> 
> 
> So I suppose that deleting one of these entries should not put the
> catalog in a corrupted state (correct me if I'm wrong).
> 
> I'll try this and do some tests with my test catalog.

Yes, it would be safe if all of the duplicates are like that.
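
If you want to check that first, and then collapse the identical rows,
something along these lines might do it (a rough sketch only, assuming a
PostgreSQL catalog and that the Path table has just the PathId and Path
columns; it rewrites the whole Path table, which is usually small compared
to File):

  -- 1. Any duplicated path whose rows do NOT all share one pathid?
  --    This should return no rows before going further.
  SELECT Path FROM Path
  GROUP BY Path
  HAVING COUNT(*) > 1 AND COUNT(DISTINCT PathId) > 1;

  -- 2. Collapse fully identical rows inside a transaction.
  BEGIN;
  CREATE TEMP TABLE path_dedup AS SELECT DISTINCT PathId, Path FROM Path;
  DELETE FROM Path;
  INSERT INTO Path (PathId, Path) SELECT PathId, Path FROM path_dedup;
  -- COMMIT; when the row counts look right, or ROLLBACK; otherwise.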

__Martin

_______________________________________________
Bacula-users mailing list
Bacula-users@lists.sourceforge.net
https://lists.sourceforge.net/lists/listinfo/bacula-users
