On 14/5/24 14:18, Ed Sabol wrote:
On May 13, 2024, at 11:00 PM, Steven Haigh via modperl 
<modperl@perl.apache.org> wrote:
If I were to guess, it seems like an interaction between open3 and mod_perl.

https://perldoc.perl.org/IPC::Open3

Yes, this is a known problem with IPC::Open3 that is commonly seen with 
mod_perl. It's been a Perl bug since 2009, and nobody has apparently been 
interested in fixing it. :(

https://github.com/Perl/perl5/issues/9759

Damn, that's interesting to find out... and annoying :)

So I think you just need to work around it. I recommend not using IPC::Open3 in 
any mod_perl projects. I typically just use backticks and redirect stderr to 
stdout or to a temp file, but sometimes you just need separate filehandles. For 
that, I offer the following potential workarounds that you can try:

(1) Use IPC::Run instead of IPC::Open3. I've seen at least one report that says 
that IPC::Run works well in mod_perl, and it can do similar things.

https://metacpan.org/pod/IPC::Run

I did have some success with IPC::Run - and it does seem to work as expected - however, I hit buffering issues that I didn't see when using the open3 code.

One of the tools I use this code with is a looking glass, so you can perform traceroutes to various places.

Using cgi-script + open3, the output is returned line by line to the browser - which is really nice.

Using IPC::Run, you get no output until the traceroute has completed.

I haven't spent a ton of time yet seeing whether that buffering can be altered in IPC::Run, but either way, blatting the entire traceroute to the screen at once is a less than optimal result.

It can take quite a while for a traceroute to complete if there are 20+ hops that don't respond - which means you get no output at all until it's done.
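
From a quick read of the IPC::Run docs, it looks like you can pass a code reference as the output target instead of a scalar ref, and the sub gets called with each chunk as it's read from the pipe. Something like the sketch below - untested under mod_perl on my side, reusing the $vars hash from my existing code, and still leaning on stdbuf because traceroute block-buffers once its stdout is a pipe:

        use IPC::Run qw(run);

        $| = 1;        # don't let Perl buffer what we print back to the browser

        my @cmd = ( '/usr/bin/stdbuf', '-oL', '/usr/bin/traceroute',
                    ( $vars->{'type'} eq "ipv6" ? "-6" : "-4" ),
                    $vars->{'host'} );

        run \@cmd,
            '>',  sub { print $_[0] },    # stdout, printed chunk by chunk as it arrives
            '2>', sub { print $_[0] };    # stderr too, merged into the page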

(2) Re-jigger the file descriptors as shown here:

https://stackoverflow.com/questions/2097247/ipcopen3-fails-running-under-apache/24311232#24311232

I did try this, but the output still went to the Apache logs.

For the traceroute example, I tried this code:

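        # (assumes "use IPC::Open3; use Symbol qw(gensym);" earlier in index.pl)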
        my $cmd = "/usr/bin/stdbuf";
        my @args = qw@ -o1 -e1 /usr/bin/traceroute @;

        if ( $vars->{'type'} eq "ipv6" ) {
                push @args,"-6";
        } else {
                push @args,"-4";
        }

        push @args, $vars->{"host"};

        print '<pre class="alert">';
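        # Save the tied mod_perl STDIN/STDOUT, then re-open them directly on
        # file descriptors 0 and 1 so open3 has real fds to dup - the
        # re-jiggering from the Stack Overflow answer above.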
        my ($save_stdin,$save_stdout);
        open $save_stdin, '>&STDIN';
        open $save_stdout, '>&STDOUT';
        open STDIN, '>&=0';
        open STDOUT, '>&=1';

        my $pid = open3(0, (my $pipe = gensym), 0, $cmd, @args);
        while ( my $output = <$pipe> ) {
                print $output;
        }

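        # Drop the fd-based handles and put the saved tied handles back so the
        # closing print goes to the browser again.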
        close(STDIN);
        close(STDOUT);
        open STDIN, '>&', $save_stdin;
        open STDOUT, '>&', $save_stdout;
        print "Trace Complete</pre>";

This still printed the output to the Apache logs, along with a warning:

Warning: unable to close filehandle $save_stdin properly: at /var/www/html/lg/index.pl line 66.

I'm starting to wonder whether, given the various 'strange' behaviours, it might be easier to not use mod_perl for this and keep using the cgi-script handler. It isn't a high-performance or popular site at all - I've been using it more as a learning exercise - but I'm wondering if the benefits of using mod_perl here are purely academic :)


--
Steven Haigh

📧 net...@crc.id.au
💻 https://crc.id.au
