I set the AS (address space) rlimit to protect against 'runaway' processes
that allocate too much VM, so that Perl dies with "Out of Memory".
The __DIE__ handler is called, but it cannot write to STDOUT (which is
connected to the TCP connection to the client) because STDOUT has been
closed.
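A minimal sketch of the setup being described, under stated assumptions: the poster did not say how the rlimit was set, so BSD::Resource and the 200 MB figure here are illustrative. The point the sketch makes is that a __DIE__ handler can still reach STDERR (Apache's error log) even after STDOUT (the client socket) is gone.

```perl
use strict;
use warnings;

# Assumption: the AS rlimit was set with BSD::Resource (guarded so the
# sketch still runs where that module is not installed).
if (eval { require BSD::Resource; 1 }) {
    BSD::Resource::setrlimit(
        BSD::Resource::RLIMIT_AS(),
        200 * 1024 * 1024,    # illustrative 200 MB soft limit
        200 * 1024 * 1024,    # hard limit
    ) or warn "setrlimit failed: $!";
}

# When an allocation fails, Perl dies with "Out of memory!". STDOUT
# (the TCP connection to the client) may already be closed by then,
# so write to STDERR, which mod_perl ties to Apache's error log.
local $SIG{__DIE__} = sub {
    print STDERR "fatal: $_[0]";
    die $_[0];    # re-throw so the request still fails
};
```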
Is there something else I h…
On Mon, Mar 22, 2010 at 12:50 PM, ARTHUR GOLDBERG wrote:
> Is there a way to get a mod_perl process that dies with "out of memory"
> trapped by the "ErrorDocument 500" handler?
If it crashes, it's too late. If it's actually Perl pre-empting the
crash, then it ought to work. The key is whether y…
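For reference, the directive being discussed is an ordinary Apache config line; the page path here is illustrative:

```apache
# httpd.conf sketch: serve a custom page when a request ends in a 500.
# As the reply notes, this only helps if Perl/mod_perl survives long
# enough to return an error -- if the child is killed outright, Apache
# may simply drop the connection instead.
ErrorDocument 500 /error500.html
```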
Hello All
Resending, as nobody replied.
Is there a way to get a mod_perl process that dies with "out of
memory" trapped by the "ErrorDocument 500" handler?
I'm running Perl programs in mod_perl in Apache (2.2) on RHEL, using
the prefork MPM.
I want to protect my server against Perl processes that grow much too
large, as they can slow or even freeze the system.
On 16 Mar 2010, at 21:00, ARTHUR GOLDBERG wrote:
> Is there a way to configure this, or another way to provide some error output
> to a browser that sent a Request that caused the server process to die?
Error 500 is totally different to the server dying. If you could bail out
without dying (soft…
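One way to read "bail out without dying" is to trap the failure in an eval and return a normal 500 response, so that Apache's ErrorDocument machinery can still run. A sketch, assuming a mod_perl 2 response handler; the handler name and do_expensive_work() helper are hypothetical:

```perl
package My::Handler;
use strict;
use warnings;
use Apache2::RequestRec ();
use Apache2::RequestIO ();
use Apache2::Const -compile => qw(OK SERVER_ERROR);

sub handler {
    my ($r) = @_;
    # Trap the failure instead of letting the process die mid-response.
    my $out = eval { do_expensive_work($r) };   # hypothetical helper
    if ($@) {
        $r->log_error("request failed: $@");
        return Apache2::Const::SERVER_ERROR;    # a normal 500 response
    }
    $r->content_type('text/plain');
    $r->print($out);
    return Apache2::Const::OK;
}

1;
```

Note that this only works when Perl can recover from the failure; if the allocation itself is what kills the process, the eval never returns.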
Hello
I'm running Perl programs in mod_perl in Apache (2.2) on RHEL, using
the prefork MPM.
I want to protect my server against Perl processes that grow much too
large, as they can slow or even freeze the system. So I've set up an
address space resource limit via Perl's Apache2::SizeLimit. P…
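A sketch of that setup, assuming mod_perl 2's Apache2::SizeLimit interface (set_max_process_size takes kilobytes; check the installed version's docs, and note the 200 MB figure is illustrative, not from the thread):

```perl
# startup.pl sketch -- loaded via "PerlRequire startup.pl" in httpd.conf.
use Apache2::SizeLimit;

# Retire a child after the request in which it exceeds ~200 MB.
Apache2::SizeLimit->set_max_process_size(200 * 1024);    # in KB
```

Plus, in httpd.conf:

```apache
PerlCleanupHandler Apache2::SizeLimit
```

One caveat worth keeping in mind: Apache2::SizeLimit checks the process size in a cleanup handler, i.e. only after a request completes, so it cannot stop a single runaway request mid-flight; that is what an OS-level rlimit is for.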