On 12/18/2012 12:02 AM, Rajeev Prasad wrote:
thx a lot Slava, it works!
lastly, does pipe_out avoid 'storing' the data in RAM and stream it
straight to disk? unlike collecting the response from 'capture' in an array?
pipe_out returns a file handle from which you can read lines (or chunks
or whatever):
    my $pipe = $ssh->pipe_out($cmd);
    while (<$pipe>) {
        # the current line is in $_
        ...
    }
    close $pipe or die "some error happened while reading from pipe: $!";
You can also save the output of the command to disk easily:
    $ssh->system({stdout_file => "/tmp/cmd-output"}, $cmd)
        or die "ssh command failed: " . $ssh->error;
Using capture in list context reads all the data into memory, splits it
into lines, and returns them; it is probably the most memory-consuming
approach.
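For contrast, a minimal sketch of the list-context capture approach (the
host, command, and variable names are illustrative, not from the thread):

```perl
use strict;
use warnings;
use Net::OpenSSH;

# hypothetical connection, for illustration only
my $ssh = Net::OpenSSH->new('user@remote.host');
$ssh->error and die "ssh connection failed: " . $ssh->error;

# capture in list context slurps the ENTIRE output into memory,
# then splits it into lines - peak memory is a multiple of the
# output size (300MB of output can easily balloon to several GBs)
my @lines = $ssh->capture('egrep "data_to_grep" *.data_file.txt');
$ssh->error and die "remote command failed: " . $ssh->error;

for my $line (@lines) {
    # process $line
}
```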
rgds,
Rajeev
________________________________
From: Salvador Fandino <sfand...@yahoo.com>
To: Rajeev Prasad <rp.ne...@yahoo.com>
Cc: perl list <beginners@perl.org>
Sent: Monday, December 17, 2012 11:15 AM
Subject: Re: Net::Openssh not fetching data
On 12/17/2012 05:21 PM, Rajeev Prasad wrote:
the following is _i think_ timing out. when it is run from within my script
( @cmdresult, $cmderr ) = $ssh->capture($CMD);
where $CMD is: egrep "data_to_grep" *.data_file.txt
the output is about 300Mb of data.
further, the command when run on the remote system directly (after logging in),
takes only about 30 seconds.
also, when we run from my localhost (bash shell) ssh "$CMD" > local.file.save
it completes within 2 minutes...
please advise.
ty.
Rajeev
300MB of data may be too much to capture; unless you are very careful
you will end up with a script requiring several GBs of memory to run.
Try saving the output to a file and processing it afterwards line by
line, or use the Net::OpenSSH pipe_out method to read and process the
data on the fly without storing it all in memory at once.
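A minimal sketch of the save-then-process approach (host, file path, and
grep pattern are illustrative assumptions, not taken from the thread):

```perl
use strict;
use warnings;
use Net::OpenSSH;

# hypothetical host, for illustration only
my $ssh = Net::OpenSSH->new('user@remote.host');
$ssh->error and die "ssh connection failed: " . $ssh->error;

# stream the remote command's stdout straight to a local file;
# nothing is accumulated in the script's memory
$ssh->system({stdout_file => '/tmp/cmd-output'},
             'egrep "data_to_grep" *.data_file.txt')
    or die "remote command failed: " . $ssh->error;

# process the saved file one line at a time - constant memory
open my $fh, '<', '/tmp/cmd-output'
    or die "cannot open saved output: $!";
while (my $line = <$fh>) {
    # process $line here
}
close $fh;
```

The same constant-memory behaviour is what pipe_out gives you without
the intermediate file, as shown earlier in the thread.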
--
To unsubscribe, e-mail: beginners-unsubscr...@perl.org
For additional commands, e-mail: beginners-h...@perl.org
http://learn.perl.org/