Re: Don't use -v option with 2.4.6

2001-06-07 Thread Eric Whiting
John, Wayne's fixes solved the 2.4.6 hangs I had been seeing for a long time. I had actually been running 2.3.2 because the 2.4.6 hangs were killing me... Now I'm slowly moving all my apps up to the 2.4.6 + Wayne patch version of rsync. I hope his patch (or something similar) gets included in the offici

Re: Don't use -v option with 2.4.6

2001-06-07 Thread Wayne Davison
> Remi Laporte wrote: > > The rsync 2.4.6 -v option is buggy and causes transfers to hang, so if > > you have such a problem, first try removing the -v. On Thu, 7 Jun 2001, John E. Mayorga wrote: > Can anyone else confirm this? My recently posted anti-hang patch should hopefully fix this for y

@ERROR: max connections (16) reached - try again later

2001-06-07 Thread Phil Howard
I've been getting this problem, as mentioned in a prior post: @ERROR: max connections (16) reached - try again later even after 1 connection and after killing that connection. But it appears to be a problem in Linux kernel 2.4.0. An upgrade to 2.4.5 clears it up.
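
For context, that error string comes from the rsync daemon when a module's connection limit has been reached; the connection count is tracked through record locks on the daemon's lock file. A minimal rsyncd.conf sketch that would produce the "(16)" figure might look like the lines below; the module name and path are hypothetical, and only the max connections and lock file settings matter here.

    # hypothetical module; only "max connections" and "lock file" are relevant
    [backup]
        path = /var/backup
        max connections = 16
        lock file = /var/run/rsyncd.lock

If the count sticks at the limit even with no clients connected, the lock file and its record locks are the first thing worth checking, which would be consistent with the kernel-level explanation above.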

Re: too many files

2001-06-07 Thread Phil Howard
If that stores the examined file name/status in the file INSTEAD of in memory, this could work. If it memory-maps that file, it could still end up being a problem at a later date (this whole thing could end up being scaled up to 50x the size if it gets committed, and a solid backup would be cruci

Re: Don't use -v option with 2.4.6

2001-06-07 Thread John E. Mayorga
All, Can anyone else confirm this? I've been getting aborts, and I just tried taking out the "-v", but the stats are not always generated by the "--stats" option, so I think it is still aborting. Thanks, John Remi Laporte wrote: > Hi all, > > The rsync 2.4.6 -v option is buggy and causes hangs o
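
A hedged sketch of the invocation being discussed: drop -v but keep --stats. The paths and host below are placeholders, not from the original message.

    # -v was reported to trigger hangs with 2.4.6, so it is left out;
    # --stats still prints transfer totals, but only if the run completes
    rsync -a --stats /src/dir/ remotehost:/dest/dir/

If a run aborts before the end, no statistics are printed, which matches the observation that --stats output is "not always generated".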

Re: too many files

2001-06-07 Thread Dale Phillips
I got around this by using --temp=/SomeLargeFreeDir. But again - it's getting around it, not fixing it. Dale --- [EMAIL PROTECTED] wrote: > I'm seeing this problem too, and the problem isn't the > amount of data or number of files transferred. It's the > number of files examined. It'll fail r
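
The option spelling in the preview looks abbreviated. Assuming the intent was rsync's temporary-directory option, a sketch of the workaround would be the following; the directory, host, and paths are placeholders.

    # --temp-dir (-T) makes the receiving rsync build its temporary
    # files in a directory with plenty of free space
    rsync -a --temp-dir=/SomeLargeFreeDir remotehost:/src/ /local/dest/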

Re: unexpected EOF in read_timeout on large file lists

2001-06-07 Thread Dale Phillips
I had the same error; this is what fixed it for me: /usr/bin/rsync -avc --stats --timeout=0 \ Hostname:/source/ /TargetOnLocalHost Note the --timeout, and you can also speed the process up by dropping the c from the -avc. Hope this helps Dale --- Wileczek_Mickaël <[EMAIL PROTECTED]> wrote: >
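
Reconstructing the flattened command from the preview, plus the suggested variant without -c; Hostname and /TargetOnLocalHost are the poster's placeholders.

    # original fix: --timeout=0 disables rsync's I/O timeout entirely
    /usr/bin/rsync -avc --stats --timeout=0 \
        Hostname:/source/ /TargetOnLocalHost

    # dropping the c skips the full-file checksum pass (-c), which is
    # usually much faster on large trees
    /usr/bin/rsync -av --stats --timeout=0 \
        Hostname:/source/ /TargetOnLocalHost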

Re: too many files

2001-06-07 Thread tim . conway
I'm seeing this problem too, and the problem isn't the amount of data or number of files transferred. It's the number of files examined. It'll fail rsyncing two already identical enormous directory structures. Tim Conway [EMAIL PROTECTED] 303.682.4917 Philips Semiconductor - Colorado TC 1880

unexpected EOF in read_timeout on large file lists

2001-06-07 Thread Wileczek Mickaël
I am trying to rsync 2 GB file systems with lots of tiny files over a slow link. Approximately 2 hours after the beginning, the rsync process stops with the error 'unexpected EOF in read_timeout', leaving an rsh (rsync) process on the remote machine. I've seen such a problem several times on the
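
Dale's reply further up in this digest suggests --timeout=0. Before retrying, it can also help to clear the rsh/rsync processes left behind on the remote machine; the commands below are a hedged sketch, not from the original message.

    # on the remote machine: list any rsync/rsh processes left over
    ps ax | grep -E 'rs(ync|h)' | grep -v grep

    # once confirmed they belong to the failed run, kill them, e.g.
    pkill rsync    # or: kill <pid> for each listed process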

Re: too many files

2001-06-07 Thread John N S Gill
> I'm going to try to split things up into multiple rsync runs. The > great difficulty with this is that names at the first level I could > split on are changing, meaning I am going to have to manually handle > this about 2 or 3 times a week. It should be an automated thing > that just does th
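
A hedged sketch of that kind of automation: re-read the first-level names on every pass instead of hard-coding them, and run one rsync per directory so no single run has to examine the whole tree. The source, destination, and host below are hypothetical.

    #!/bin/sh
    # one rsync run per first-level directory; the glob is re-evaluated
    # each time the script runs, so renamed or added directories are picked up
    SRC=/export/data
    DEST=remotehost:/backup/data
    for dir in "$SRC"/*/ ; do
        name=$(basename "$dir")
        rsync -a "$dir" "$DEST/$name/"
    done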