Hi all,

On Fri, 11 Dec 2015 11:07:54 -0500
Shawn H Corey <shawnhco...@gmail.com> wrote:

> On Fri, 11 Dec 2015 16:28:39 +0100
> Ori Raz <fcb...@gmail.com> wrote:
> 
> > Hi,
> > Did anyone encounter the scenario where scp_put is failing (too many
> > arguments) when the directory contains too many files?
> > We have 36K files in the directory and it is failing... (with a lower
> > number of files it works fine)
> > 
> > This is how we use it:
> > $dr_node->scp_put( { recursive       => 1,
> >                      glob            => 1,
> >                      copy_user_attrs => 1 },
> >                    "$ib_backup_path/*", $ib_backup_path );
> > 
> > And we get the error:
> > Can't exec "rsync": Argument list too long at
> > /perl/lib/perl5/site_perl/5.16.0/Net/OpenSSH.pm line 1433
> > 
> > Appreciate any advice :)
> > 
> > Thanks.  
> 
> rsync(1) is a UNIX utility (http://linux.die.net/man/1/rsync). Because of
> this, the only solution I can think of is to do one file at a time. Set
> up a loop to read their names and send them one at a time.
> 

To increase performance, one can either:

1. Segment the list of files into chunks of $N files each (where $N is a
relatively large integer) - one can use natatime (= "N at a time") from
List::MoreUtils ( https://metacpan.org/pod/List::MoreUtils ) for that; see
the first sketch after this list.

2. Copy the entire directory along with all of its contents in a single
recursive call, instead of globbing its contents; see the second sketch
after this list.
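Here is a minimal, untested sketch of the chunking approach, assuming
$dr_node and $ib_backup_path are set up as in the original post, and
carrying over the copy_user_attrs option from that call (the chunk size of
1,000 is an arbitrary choice):

use strict;
use warnings;
use List::MoreUtils qw(natatime);

# Expand the glob locally instead of passing "$ib_backup_path/*" to the
# remote command line, so no single call sees a huge argument list.
my @files = glob("$ib_backup_path/*");

# Transfer the files 1,000 at a time.
my $it = natatime(1000, @files);
while ( my @chunk = $it->() ) {
    $dr_node->scp_put(
        { recursive => 1, copy_user_attrs => 1 },
        @chunk, $ib_backup_path,
    ) or die "scp_put failed: " . $dr_node->error;
}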
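And a sketch of the second approach: pass the directory itself (a single
path) instead of its globbed contents, so the argument list stays short no
matter how many files the directory holds. This assumes recreating the
directory under its parent on the remote side is acceptable:

use File::Basename qw(dirname);

# One recursive copy of the whole directory; only one local path is
# passed on the command line.
$dr_node->scp_put(
    { recursive => 1, copy_user_attrs => 1 },
    $ib_backup_path, dirname($ib_backup_path),
) or die "scp_put failed: " . $dr_node->error;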

I wonder why Net::OpenSSH is running rsync here, though, given that the
call was to scp_put().

Also of note is Rob Pike's comment about the argument list being limited
in size on UNIX systems:

Regards,

        Shlomi Fish
