On Sat, Sep 17, 2011 at 6:15 PM, Rajeev Prasad <rp.ne...@yahoo.com> wrote:

> Hi Salva,
> Thanks for responding. I need to SSH into several nodes, run some commands
> there, and capture the modified output to a file on the local node from
> which the script is run.

The easiest way to do that with Net::OpenSSH::Parallel is to save the
unmodified output of the remote commands into local files and then
postprocess them:

  # run $cmd on every host, saving its stdout to a per-host file
  $pssh->push('*', cmd => {stdout_file => '/tmp/out-%HOST%'}, $cmd);
  $pssh->run;

  # then postprocess the captured output locally
  for my $host (@hosts) {
    open my $fh, '<', "/tmp/out-$host" or die "unable to open /tmp/out-$host: $!";
    while (<$fh>) {
      ...
    }
  }


You can also use an external program to process the output on the fly:

  $pssh->push('*', cmd => {stdout_file => ['|-', 'grep foo >/tmp/out-%HOST%']}, $cmd);


Or you can use the 'parsub' action to handle the output yourself in
Perl (though in a different process):

  use Fcntl qw(:flock);
  use IO::Handle;

  open my $out, '>', $filename or die "unable to open $filename: $!";

  sub worker {
     my ($label, $ssh) = @_;
     my $data = $ssh->capture($cmd);
     my $processed_data = process($data);
     # serialize access to the shared output file across worker processes
     flock($out, LOCK_EX);
     print $out $processed_data;
     $out->flush;                 # push the data out before releasing the lock
     flock($out, LOCK_UN);
  }

  $pssh->push("*", parsub => \&worker);
  $pssh->run;


What you can't do with Net::OpenSSH::Parallel is process the output of
the remote commands on the fly in the same process, or pipe the output
of a command running on one host into the input of another process
running on a different host. If you need that, go with one of the Perl
event-based frameworks (AnyEvent, POE) or use threads.
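
For illustration only (this is not from my earlier examples), here is a
minimal sketch of the event-based approach, assuming AnyEvent,
AnyEvent::Handle and Net::OpenSSH's pipe_out method; the host names and
the remote command are placeholders:

  use strict;
  use warnings;
  use AnyEvent;
  use AnyEvent::Handle;
  use Net::OpenSSH;

  my @hosts = qw(host1 host2);    # placeholder host names
  my $cmd   = 'some-command';     # placeholder remote command

  my $cv = AnyEvent->condvar;
  my (%ssh, %handle);

  for my $host (@hosts) {
    my $ssh = Net::OpenSSH->new($host);
    $ssh->error and die "unable to connect to $host: " . $ssh->error;
    $ssh{$host} = $ssh;           # keep the master connection alive

    # pipe_out returns a handle attached to the remote command's stdout
    my ($fh, $pid) = $ssh->pipe_out($cmd)
      or die "pipe_out failed on $host: " . $ssh->error;

    $cv->begin;
    $handle{$host} = AnyEvent::Handle->new(
      fh       => $fh,
      on_read  => sub {
        $_[0]->push_read(line => sub {
          my (undef, $line) = @_;
          print "[$host] $line\n";   # process each line as it arrives
        });
      },
      on_eof   => sub { $cv->end },
      on_error => sub { $cv->end },
    );
  }

  $cv->recv;    # run the event loop until every remote command finishes

Every line is handled inside a single event-driven process as soon as
it arrives, which is what the fork-based model above cannot do.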

In my experience, the first approach described above (using
intermediate temporary files) is usually the right one.
