Sorry for the additional post, but it's a "fun" problem when you consider
the mangled file/path names a raw database dump can create. I made some
improvements to the example script to make sure such names are handled
correctly.
On Mon, 28 Dec 2009 18:40:57 -0800 "J.C. Roberts" wrote:
> # The `while read VAR;do ...; done < file.txt;` construct fails when
> # a backslash followed by a space is present in the input file. In
> # essence it seems to be escaping the space, and thereby dropping the
> # leading backslash. i.e.
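The usual workaround is a sketch like the one below (not from the quoted
script; the file name is a placeholder): read -r stops read(1) from treating
backslashes as escapes, and an empty IFS keeps leading and trailing
whitespace intact, so names like "foo\ bar" pass through unchanged.

while IFS= read -r name; do
    printf '%s\n' "$name"
done < filelist.txt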
On Fri, 11 Dec 2009 18:24:24 -0500 "STeve Andre'" wrote:
>I am wondering if there is a port or otherwise available
> code which is good at comparing large numbers of files in
> an arbitrary number of directories? I always try to avoid
> wheel re-creation when possible. I'm trying to help some-
On Saturday, December 12, 2009, Andy Hayward wrote:
> On Fri, Dec 11, 2009 at 23:24, STeve Andre' wrote:
>> I am wondering if there is a port or otherwise available
>> code which is good at comparing large numbers of files in
>> an arbitrary number of directories? I always try to avoid
>> wheel
On Fri, Dec 11, 2009 at 23:24, STeve Andre' wrote:
> I am wondering if there is a port or otherwise available
> code which is good at comparing large numbers of files in
> an arbitrary number of directories? I always try to avoid
> wheel re-creation when possible. I'm trying to help some-
> on
On 11 December 2009, STeve Andre' wrote:
>I am wondering if there is a port or otherwise available
> code which is good at comparing large numbers of files in
> an arbitrary number of directories? I always try to avoid
> wheel re-creation when possible. I'm trying to help some-
> one with large
On 12/12/2009, at 4:22 PM, Frank Bax wrote:
STeve Andre' wrote:
but am trying to come up with a reasonable way
of spotting duplicates, etc.
You mean like this...
$ cp /etc/firmware/zd1211-license /tmp/XX1
$ cp /var/www/icons/dir.gif /tmp/XX2
$ fdupes /etc/firmware/ /var/www/icons/ /tmp/
/tmp
On 12/11/09, STeve Andre' wrote:
> I should have been more clear I suppose. I'd like to know
> the files that are identical, files that are of the same
> name but different across directories, possibly several
> directories.
Unison is in ports. Enjoy :)
--
http://www.glumbert.com/media/shif
STeve Andre' wrote:
but am trying to come up with a reasonable way
of spotting duplicates, etc.
You mean like this...
$ cp /etc/firmware/zd1211-license /tmp/XX1
$ cp /var/www/icons/dir.gif /tmp/XX2
$ fdupes /etc/firmware/ /var/www/icons/ /tmp/
/tmp/XX2
/var/www/icons/dir.gif
/var/www/icons/fo
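For whole directory trees, fdupes can also recurse into subdirectories; a
usage sketch (not part of the quoted mail):

$ fdupes -r /etc/firmware/ /var/www/icons/ /tmp/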
On Friday 11 December 2009 19:11:18 anonymous wrote:
> On Fri, Dec 11, 2009 at 06:24:24PM -0500, STeve Andre' wrote:
> >I am wondering if there is a port or otherwise available
> > code which is good at comparing large numbers of files in
> > an arbitrary number of directories? I always try av
On Sat, Dec 12, 2009 at 02:31:54AM +0100, Alexander Bochmann wrote:
> Hi,
>
> ...on Fri, Dec 11, 2009 at 06:52:09PM -0500, STeve Andre' wrote:
>
> > > Compare how?
> > I should have been more clear I suppose. I'd like to know
> > the files that are identical, files that are of the same
> > n
On Fri, Dec 11, 2009 at 8:31 PM, Alexander Bochmann wrote:
> find . -type f -print0 | xargs -0 -r -n 100 md5 -r > md5sums
>
> You could now just sort the md5sums file to find
> all entries with the same md5... Or sort by filename
> (will need some more logic if files are distributed
> over severa
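One rough way to finish that step, building on the pipeline above (a sketch,
not from the original mail): md5 -r prints "checksum filename", so any
checksum that appears more than once marks files with identical content.

$ cut -d ' ' -f 1 md5sums | sort | uniq -d > dup-sums
$ grep -F -f dup-sums md5sums | sort

Sorting the final output groups the duplicate files by checksum.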
On Friday 11 December 2009 20:31:54 Alexander Bochmann wrote:
> Hi,
>
> ...on Fri, Dec 11, 2009 at 06:52:09PM -0500, STeve Andre' wrote:
>
> > > Compare how?
> > I should have been more clear I suppose. I'd like to know
> > the files that are identical, files that are of the same
> > name bu
Hi,
...on Fri, Dec 11, 2009 at 06:52:09PM -0500, STeve Andre' wrote:
> > Compare how?
> I should have been more clear I suppose. I'd like to know
> the files that are identical, files that are of the same
> name but different across directories, possibly several
> directories.
Maybe you co
On Friday 11 December 2009 18:36:33 Noah Pugsley wrote:
> STeve Andre' wrote:
> >I am wondering if there is a port or otherwise available
> > code which is good at comparing large numbers of files in
> > an arbitrary number of directories? I always try to avoid
> > wheel re-creation when possible
diff(1), if you want to compare specific files or dirs, or
fdupes for searching for arbitrary files in arbitrary locations.
paulm
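For the diff(1) route, a recursive run in brief mode reports which files
differ and which exist in only one tree; a sketch with placeholder directory
names:

$ diff -rq /some/dir1 /some/dir2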
On 12/12/2009, at 12:24 PM, STeve Andre' wrote:
I am wondering if there is a port or otherwise available
code which is good at comparing large numbers of file
On Fri, Dec 11, 2009 at 06:24:24PM -0500, STeve Andre' wrote:
>I am wondering if there is a port or otherwise available
> code which is good at comparing large numbers of files in
> an arbitrary number of directories? I always try to avoid
> wheel re-creation when possible. I'm trying to help so
STeve Andre' wrote:
I am wondering if there is a port or otherwise available
code which is good at comparing large numbers of files in
an arbitrary number of directories? I always try to avoid
wheel re-creation when possible. I'm trying to help some-
one with large piles of data, most of which
2009/12/12 STeve Andre':
> I am wondering if there is a port or otherwise available
> code which is good at comparing large numbers of files in
> an arbitrary number of directories? I always try to avoid
Try rsync if you just want to know which files differ.
Best
Martin
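A checksum-based dry run is one way to do that with rsync (a sketch;
directory names are placeholders). Nothing is copied; rsync just itemizes
files whose contents differ or that are missing from the second tree:

$ rsync -rcni /some/dir1/ /some/dir2/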