Well, "large" can span orders of magnitude, depending who you ask. :-)

Could you please try to identify some steps to reproduce the problem,
including generation of the big file? For example:

  # 100 MiB file, random data (difficult to compress)
  head -c 100M /dev/urandom > foo

  # 100 MiB file, all 0s (easy to compress)
  head -c 100M /dev/zero > foo

Then you can try to rsync it via ssh to a different directory on
localhost (I assume you're using ssh and not the rsync daemon
protocol).
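For instance, something like this (the destination path is just an
example; any writable directory will do):

  # create a scratch destination directory
  mkdir -p /tmp/rsync-test
  # copy the file over ssh to localhost; -a preserves attributes,
  # -v shows progress, -e ssh forces the ssh transport
  rsync -av -e ssh foo localhost:/tmp/rsync-test/
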
Thanks!
