On 03/31/2016 09:53 AM, m.r...@5-cent.us wrote:
> Oddity: rsync *should* be recursing and dealing with a very large number
> of files. It works going from box a to box b, but when I try to back b up
> to c it fails, 100% of the time, complaining of "out of hashtable space
> [sender]". I've tried adding -r and changing --delete to --delete-delay,
> and no joy.

All boxes are current, or fairly current, CentOS 7.

The only thing I know of that's likely to cause rsync to run out of memory
is when there are a huge number of hard links and you are using the "-H"
option to preserve them. (If you think you don't have many, look under
/var/lib/yum/yumdb and /usr/share/zoneinfo.)
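For what it's worth, a quick way to check is to count regular files whose
link count is greater than 1. This is just a sketch using GNU find; the
throwaway-directory demo is purely illustrative:

```shell
# On a real system you'd point this at the suspect trees, e.g.:
#   find /var/lib/yum/yumdb -type f -links +1 | wc -l
#   find /usr/share/zoneinfo -type f -links +1 | wc -l

# Self-contained demonstration in a temporary directory:
dir=$(mktemp -d)
echo data > "$dir/a"
ln "$dir/a" "$dir/b"                    # second name for the same inode
find "$dir" -type f -links +1 | wc -l   # both names are counted: prints 2
rm -rf "$dir"
```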

And FYI, rsync doesn't do a very good job of preserving hard links when
going to a destination that already has some of the files. It doesn't
break any existing hard links at the destination, so you can end up with
a hard link topology that is somewhat different from the source.  I have
to run some very messy audits to make sure all copies of my backups have
the same arrangement of hard-linked files.
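One way such an audit can be sketched (this is not my actual script, just
an illustration): since inode numbers differ between trees, compare the
*grouping* of paths that share an inode, not the inode numbers themselves.

```shell
# For each hard-link group under a tree, print the sorted set of member
# paths (relative to the tree root) on one line. Two trees with the same
# hard-link topology produce identical output.
hardlink_groups() {
    find "$1" -type f -links +1 -printf '%D:%i %P\n' \
        | sort \
        | awk '{ grp[$1] = grp[$1] " " $2 }
               END { for (k in grp) print grp[k] }' \
        | sort
}
# Usage (hypothetical paths):
#   diff <(hardlink_groups /backup/a) <(hardlink_groups /backup/b)
```

A non-empty diff means some copies of the backup link their files
differently, which is exactly the drift described above.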

--
Bob Nichols     "NOSPAM" is really part of my email address.
                Do NOT delete it.

_______________________________________________
CentOS mailing list
CentOS@centos.org
https://lists.centos.org/mailman/listinfo/centos