On Sat, Sep 07, 2024 at 10:37:04AM +0100, Michael wrote:
> On Friday 6 September 2024 22:41:33 BST Frank Steinmetzger wrote:

> > > > > Someone more knowledgeable should be able to knock out some clever
> > > > > python
> > > > > script to do the same at speed.
> > 
> > And that is exactly what I have written for myself over the last 11 years. I
> > call it dh (short for dirhash). As I described in the previous mail, I use
> > it to create one hash file per directory. But it also supports one hash
> > file per data file and – a rather new feature – one hash file at the root
> > of a tree. Have a look here: https://github.com/felf/dh
> > Clone the repo or simply download the one file and put it into your path.
> 
> Nice!  I've tested it briefly here.  You've put quite some effort into this.  
> Thank you Frank!
> 
> Probably not your use case, but I wonder how it can be used to compare SOURCE 
> to DESTINATION where SOURCE is the original fs and DESTINATION is some 
> backup, 
> without having to manually copy over all the per-directory
> Checksums.md5 files.

When I have this problem, I usually diff the checksum files with mc or vim,
because I don’t usually have to check many directories and files. You could
also use Krusader, a two-panel file manager: it has a synchronise tool with
a file filter, so you can synchronise the two sides, compare file contents,
and filter for *.md5.
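For a quick scripted check, something along these lines works too (just a
sketch; `compare_sums` is a name I made up, not part of dh). Sorting first
means the comparison doesn’t care about the order of entries inside the
checksum files:

```shell
compare_sums() {
  # $1 and $2 are two Checksums.md5 files, e.g. one from the source
  # tree and one from the backup.  Sort both so that line order
  # inside the files does not matter, then diff the results.
  a=$(mktemp) b=$(mktemp)
  sort "$1" > "$a"
  sort "$2" > "$b"
  diff -u "$a" "$b"
  status=$?
  rm -f "$a" "$b"
  return $status   # 0 = identical, non-zero = they diverge
}
```

It prints the differing lines and returns non-zero if the two files don’t
match, so it drops straight into a loop or an `if`.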

> I suppose rsync can be used for the comparison to a backup fs anyway, your 
> script would be duplicating a function unnecessarily.

I believe rsync can be told to sync only files that match a pattern,
though as I recall it was not entirely straightforward to set up.
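From memory, the filter dance looks roughly like this (a sketch, not
tested against your exact layout; `sync_md5` is a made-up helper name).
The `--include='*/'` lets rsync descend into every directory, the second
`--include` keeps the checksum files, and `--exclude='*'` drops
everything else:

```shell
sync_md5() {
  # $1 = source tree, $2 = destination tree.
  # Copies only the Checksums.md5 files, preserving the directory
  # structure; --prune-empty-dirs skips directories that would end
  # up containing nothing.
  rsync -a --prune-empty-dirs \
        --include='*/' --include='Checksums.md5' --exclude='*' \
        "$1/" "$2/"
}
```

The trailing slashes matter: `SRC/ DST/` copies the *contents* of SRC
into DST rather than SRC itself.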

-- 
Grüße | Greetings | Salut | Qapla’
Please do not share anything from, with or about me on any social network.

They say that memory is the second thing to go...
I forgot what the first thing was.
