Note that for wget:
-t number
--tries=number
    Set number of retries to number.
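As a command-line sketch (the URL is a placeholder, and 5 is an arbitrary retry count):

```
# Retry the download up to 5 times before giving up
# (example.org/file.tar.gz is a placeholder URL)
wget --tries=5 http://example.org/file.tar.gz
```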
On Mar 31, 2010, at 8:14 PM, Eric Covener wrote:
On Wed, Mar 31, 2010 at 7:55 PM, ARTHUR GOLDBERG wrote:
httpd processes die as expected when their VM size reaches 1000 MB.
But here's …
Hi us...@httpd
As I mentioned in an earlier email, we're running mod_perl on Apache
(2.2) on RHEL, using the prefork MPM. I want to protect my server
against Perl processes that grow much too large, as they can slow or
even freeze the system.
Therefore, I'm using Apache2::Resource, adding t…
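For reference, Apache2::Resource is typically wired up in httpd.conf along these lines. This is only a sketch; the 1000-MB figure is illustrative, and PERL_RLIMIT_AS values are given as soft:hard limits in megabytes:

```
# httpd.conf fragment: cap each prefork child's address space
# via Apache2::Resource (values are soft:hard, in MB)
PerlModule Apache2::Resource
PerlSetEnv PERL_RLIMIT_AS 1000:1000
PerlChildInitHandler Apache2::Resource
```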
Is there something else I can have the handler do so that Apache httpd
sees a 500?
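One approach, sketched below, is to wrap the real work in an eval and return SERVER_ERROR on failure, so httpd itself generates the 500. This only helps when Perl's "Out of memory!" is actually trappable; it cannot catch a child killed outright by the kernel or a hard rlimit. My::Handler and do_real_work are hypothetical names:

```perl
package My::Handler;   # hypothetical package name
use strict;
use warnings;
use Apache2::RequestRec ();
use Apache2::Const -compile => qw(OK SERVER_ERROR);

sub handler {
    my $r = shift;
    # If the real work dies (e.g. a trappable "Out of memory!"),
    # return 500 so httpd serves the ErrorDocument instead of the
    # child silently disappearing.
    my $ok = eval { do_real_work($r); 1 };    # do_real_work is a placeholder
    return $ok ? Apache2::Const::OK : Apache2::Const::SERVER_ERROR;
}

1;
```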
BR
A
On Mar 22, 2010, at 4:21 PM, Eric Covener wrote:
On Mon, Mar 22, 2010 at 12:50 PM, ARTHUR GOLDBERG wrote:
Is there a way to get a mod_perl process that dies with "out of
memory" trapped by the "ErrorDocument 500" handler?
Hello All
Resending, as nobody replied.
Is there a way to get a mod_perl process that dies with "out of
memory" trapped by the "ErrorDocument 500" handler?
I'm running Perl programs in mod_perl in Apache (2.2) on RHEL, using
the prefork MPM.
I want to protect my server against Perl processes that grow much too
large, as they can slow or even freeze the system. …
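For what it's worth, the ErrorDocument side of this is a one-line httpd.conf directive (the path is illustrative):

```
# httpd.conf: serve a custom page whenever httpd returns a 500
ErrorDocument 500 /errors/500.html
```

Note that this only fires when httpd itself generates the 500 response; a child killed by the kernel OOM killer or a hard resource limit never produces a response at all, so the client typically sees a dropped or empty reply instead.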
Hello
I'm running Perl programs in mod_perl in Apache (2.2) on RHEL, using
the prefork MPM.
I want to protect my server against Perl processes that grow much too
large, as they can slow or even freeze the system. So I've set up an
address space resource limit via Perl's Apache2::SizeLimit. P…
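A typical Apache2::SizeLimit hookup, as a sketch (the size is illustrative; Apache2::SizeLimit measures in KB and terminates the child after the current request completes, rather than mid-request):

```
# httpd.conf fragment: kill a prefork child once its total
# process size exceeds ~1000 MB, checked after each request
PerlLoadModule Apache2::SizeLimit
<Perl>
  Apache2::SizeLimit->set_max_process_size(1_000_000);  # in KB
</Perl>
PerlCleanupHandler Apache2::SizeLimit
```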
Hi
(Hope this isn't a dupe. I just registered.)
Running
Server version: Apache/2.2.3
Server built: Jul 6 2009 05:29:28
Our website's accessed at two domains, like site.org and x.site.edu.
It's largely driven by Perl programs loaded by mod_perl, but has some
static content.
I want to configure …
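Since the sentence is cut off in the archive, the intended configuration is unknown; but if the goal is simply to serve the same content under both names, a minimal sketch (DocumentRoot path is illustrative) would be:

```
# Serve the same site under both hostnames
<VirtualHost *:80>
    ServerName  site.org
    ServerAlias x.site.edu
    DocumentRoot /var/www/site
</VirtualHost>
```

If instead one name should redirect to the other, mod_alias's Redirect or mod_rewrite in the secondary vhost would be the usual route.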