On Jun 24, 12:15 am, nora.hac...@stgkk.at ("HACKER Nora") wrote:
> Hi list,
>
> I am currently having an issue with background processes. I already
> searched the web and found a way how to principally send a process into
> background, wait for the execution to finish and then go on with the
> main program:
>
> my $pid = fork;
> if ( $pid ) {
>     wait;
>     print "done.\n";
> } else {
>     print "starting.\n";
>     system "compress *.dbf &";
>     exit;
> }
>
> The "done." is being issued at the correct point of time, namely when
> really all files have been compressed. Unfortunately, this only
> compresses one file after the other and therefore lacks performance. So
> I tried to send multiple processes into background and have them to be
> executed simultaneously. I tried the following:
>
> my $pid = fork;
> if ( $pid ) {
>         wait;
>         print "done.\n";
> } else {
>         print "starting.\n";
>         foreach ( glob "*.dbf" ) {
>                 system "compress $_ &";
>         }
>         exit;
> }
>
> This behaves as expected, from the "simultaneous" and "background" point
> of view - but my big problem is now that the message "done." is being
> issued as soon as all the compression processes are ISSUED instead of
> when they are FINISHED.
>
> Could anybody please point me into the right direction how to best solve
> that issue? Maybe I should fork all the single compression processes to
> get a pid for each, put them into an array and check with a loop whether
> they still exist (process running)? Would there be another / easier /
> more efficient way?
>

The backgrounding makes this harder because the system()
call will return immediately even if the backgrounded process
is still running. You can reap the b/g processes with a
non-blocking waitpid (see perldoc -f waitpid and perldoc perlipc),
but this is a bit tricky and you'll need to poll for completion
with a messy sleep-loop of some kind.
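
Roughly, the fork-per-file idea you mention would look something
like this (untested sketch; note there's no '&' on the compress
command, since the fork itself is what puts it in the background):

use strict;
use warnings;
use POSIX ":sys_wait_h";

my %kids;
foreach my $file ( glob "*.dbf" ) {
    my $pid = fork;
    die "fork failed: $!" unless defined $pid;
    if ( $pid ) {
        $kids{$pid} = $file;            # parent: remember this child
    } else {
        exec "compress", $file;         # child: no '&', the fork is the background
        die "exec compress failed: $!";
    }
}

# the messy sleep-loop: reap whatever has finished, then nap and retry
while ( %kids ) {
    while ( ( my $pid = waitpid( -1, WNOHANG ) ) > 0 ) {
        delete $kids{$pid};
    }
    sleep 1 if %kids;
}
print "done.\n";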

So I'd second the recommendation for Parallel::ForkManager,
which'll background the tasks and also provides a
set_max_procs method to avoid hogging the system.
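
With Parallel::ForkManager that boils down to something like
this (untested; the 4 is just an arbitrary cap on simultaneous
compress jobs):

use strict;
use warnings;
use Parallel::ForkManager;

my $pm = Parallel::ForkManager->new(4);    # or $pm->set_max_procs(4)
foreach my $file ( glob "*.dbf" ) {
    $pm->start and next;                   # parent: launch child, move on
    system "compress", $file;              # child: do the compression
    $pm->finish;                           # child: exit
}
$pm->wait_all_children;
print "done.\n";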

Another possibility, if impact on the system isn't a concern, is
IPC::Run, which'll background the tasks and handle the other
messy details for you:

use strict;
use warnings;
use IPC::Run qw/start finish/;

# Or, easier on the system, wrap the rest of the code in a loop
# to process only some maximum number of files at a time
# (see the batched sketch after __END__).

my @cmds;
foreach ( glob "*.dbf" ) {
    push @cmds, [ compress => $_ ];
}

# '&' tells IPC::Run to run the commands as parallel children
my $harness = start map { ( $_, '&' ) } @cmds;
$harness->finish or die "finish error: $?";
print "done...\n";

__END__
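
And to flesh out the comment above about going easier on the
system: the same thing in batches would look roughly like this
(untested; the batch size of 4 is arbitrary):

use strict;
use warnings;
use IPC::Run qw/start/;

my @cmds = map { [ compress => $_ ] } glob "*.dbf";

# launch at most 4 compress jobs at a time, wait for each batch
while ( my @batch = splice @cmds, 0, 4 ) {
    my $harness = start map { ( $_, '&' ) } @batch;
    $harness->finish or die "finish error: $?";
}
print "done...\n";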

--
Charles DeRykus

