> If script startup and module loading really is the culprit you could try the
> mod_perl approach.
>
> Load all required modules and then for each script, fork a new perl process
> which uses do "testxxx.t" to run each script.

That's a good idea - thanks!

I gave it a try and these are the times I got:

    Time   Method
    ----   ------
    6:09   prove -r tests/
    4:14   for i in tests/**/*.t ; do perl $i; done
    2:57   runscripts-forking.pl tests/**/*.t

This is for a suite of 165 test scripts.

So it does look like there are efficiencies to be had; it's just a
question of whether it's worth the bother (e.g. figuring out how to
parse the output of the forked scripts).
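Parsing the forked scripts' output may be less painful than it sounds: a piped open forks and hands the parent the child's STDOUT as a filehandle, so the parent can collect and scan the lines. Just a sketch, not from the thread -- the run_captured name and the canned TAP lines are made up for illustration; a real runner would `do $script` in the child instead:

```perl
#!/usr/bin/perl
use strict;
use warnings;

sub run_captured {
    my $pid = open(my $child_out, '-|');  # fork; child's STDOUT feeds this handle
    die "Cannot fork: $!" unless defined $pid;
    if ($pid == 0) {
        # Child: stand-in for `do $script` -- emit some TAP-style lines.
        print "ok 1\n";
        print "not ok 2\n";
        exit 0;
    }
    my @lines = <$child_out>;             # parent: read everything the child printed
    close $child_out;                     # close() also waits for the child
    return @lines;
}

my @out      = run_captured();
my $failures = grep { /^not ok/ } @out;
print "captured ", scalar(@out), " lines, $failures failure(s)\n";
```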

runscripts-forking.pl basically looks like this:

    #!/usr/bin/perl

    use strict;
    # ... use a ton of modules here ...

    foreach my $script (@ARGV) {
        warn "Script: $script\n";
        unless (runscript($script)) {
            warn "FAILED: Script $script: $! $@";
            last;
        }
    }

    sub runscript {
        my $script = shift;

        my $pid;
        if (!defined($pid = fork)) {
            warn "Cannot fork: $!\n";
            return;
        }
        elsif ($pid) {
            # waitpid() returns the reaped pid, not the exit status;
            # the child's status lands in $? (0 means success).
            waitpid($pid, 0);
            return $? == 0;
        }
        # Child: run the test script inside the already-loaded interpreter.
        do $script or die "Compile errors: $script: $@";
        exit;
    }
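One wrinkle in the parent branch: waitpid() returns the pid it reaped, not the child's exit status. The status word goes into $?, with the exit code in the high byte and the terminating signal (if any) in the low bits. A minimal standalone illustration, separate from the runner:

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $pid = fork;
die "Cannot fork: $!" unless defined $pid;
if ($pid == 0) {
    exit 3;                      # child exits with status 3
}
waitpid($pid, 0);                # returns the pid; the status goes into $?
my $exit_code = $? >> 8;         # high byte: the child's exit code
my $signal    = $? & 127;        # low bits: signal that killed it, if any
print "exit=$exit_code signal=$signal\n";
```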





Michael


--
Michael Graham <[EMAIL PROTECTED]>

