Hi,
Is it possible to have 2 versions of perl that use mod_perl and libapreq?
I don't need to use 2 versions permanently, but I need to have a version
that works while I install another version of perl with all the needed
modules including mod_perl and libapreq, then switch to the new version.
Hey Rod,
You're only supposed to get 2 processes on win32. The win32 mpm
supports only one master process and one child process (which causes
all sorts of issues and delays when the child process segfaults :-( ).
The child process by default starts 250 worker threads, which are the
"instance
Hi Bill,

I don't believe that these are the issues. Just for grins, I logged on
to a couple of servers and dumped the jobs in via a browser. As I
suspected, it was no problem and I saw them all running simultaneously.
Next I opened a couple more IE windows and tried the same thing. No
problem.

Now, it appears that what I'm seeing is related to the browser. With
IE, multiple requests to one perl script are no issue. With Firefox,
multiple requests are queued and served in order. Strange.
On Tue, 2006-03-07 at 04:43 -0800, Rod Morris wrote:
> Now, it appears that what I'm seeing is related to the browser. With
> IE, multiple requests to one perl script are no issue. With Firefox,
> multiple requests are queued and served in order. Strange.
>
> Any ideas?
Difference in keep-alive behavior.
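If you want to confirm it, the httpd.conf settings to play with are the
usual keep-alive ones (the values shown are the typical Apache 2.x
defaults, so treat this as a sketch):

    # httpd.conf -- turning keep-alive off, or shortening the timeout,
    # tends to make the browsers behave more alike, at some speed cost
    KeepAlive On
    MaxKeepAliveRequests 100
    KeepAliveTimeout 15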
On Tue, 2006-03-07 at 11:53 +0200, Octavian Rasnita wrote:
> Is it possible to have 2 versions of perl that use mod_perl and libapreq?
Yes. You just need to tell your new perl what path to live in when you
build it. I think the option is named "PREFIX." Same for apache, and
make sure you use th
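Concretely, something like this (paths here are made up; adjust to
taste). Perl's Configure takes -Dprefix rather than PREFIX, apache's
configure takes --prefix, and mod_perl's Makefile.PL can be pointed at
the new apache with MP_APXS (build libapreq the same way, with the new
perl):

    # build the new perl into its own tree
    sh Configure -des -Dprefix=/opt/perl-new
    make && make test && make install

    # build apache into its own tree as well
    ./configure --prefix=/opt/apache-new
    make && make install

    # build mod_perl with the *new* perl so it links against it,
    # pointing it at the new apache's apxs
    /opt/perl-new/bin/perl Makefile.PL MP_APXS=/opt/apache-new/bin/apxs
    make && make install

Once everything tests clean under the new tree, switching over is just
a matter of pointing your init script at the new httpd.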
Khai Doan wrote:
> -8<-- Start Bug Report 8<--
> 1. Problem Description:
>
> Calling $r->read($buf,0) results in an internal server error, and an entry
> in error_log: The LENGTH argument can't be negative at .
A slightly confusing error message, but
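For comparison, here is the usual shape of a handler that reads the
request body with an explicit, positive length (a sketch; error
checking and Content-Length handling are simplified):

    use strict;
    use Apache2::RequestRec ();
    use Apache2::RequestIO ();
    use Apache2::Const -compile => qw(OK);

    sub handler {
        my $r = shift;

        # $r->read() returns the number of bytes actually read,
        # and 0 once the request body is exhausted
        my $body = '';
        while ($r->read(my $buf, 8192)) {
            $body .= $buf;
        }

        $r->content_type('text/plain');
        $r->print(length($body), " bytes of body read\n");
        return Apache2::Const::OK;
    }
    1;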
Chris Werner wrote:
Mod_perl list: Apache 2.2.0, mod_perl 2.0.2
I have been working with an alternative protocol implementation for apache
for some time now and [as would be expected with a perl-based tool] I have
found many ways to accomplish the same task. I am asking about the
philosophy or
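For reference, the usual starting point is the echo example from the
mod_perl 2 protocol-handler docs; a trimmed sketch (module name and
port are made up):

    package MyApache2::Echo;

    use strict;
    use Apache2::Connection ();
    use APR::Socket ();
    use Apache2::Const -compile => 'OK';

    use constant BUFF_LEN => 1024;

    # configured with:
    #   Listen 8010
    #   <VirtualHost _default_:8010>
    #       PerlModule MyApache2::Echo
    #       PerlProcessConnectionHandler MyApache2::Echo
    #   </VirtualHost>
    sub handler {
        my $c = shift;
        my $socket = $c->client_socket;

        # echo everything back until the client goes away
        while (my $len = $socket->recv(my $buf, BUFF_LEN)) {
            $socket->send($buf);
        }

        return Apache2::Const::OK;
    }

    1;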
an old issue:
"a dream solution would be if all child processes could *update* a large global structure."
we have a tool that loads a huge store of data (25-50MB+) from a database into many perl hashes at startup: each session needs access to all of this data, but it would be prohibitive to use
at this point, the application is on a single machine, but I'm being tasked with moving our database onto another machine and implementing load balancing between 2 webservers.
william
On 3/7/06, Will Fould <[EMAIL PROTECTED]> wrote:
> an old issue:
> "a dream solution would be if all child processes could *update* a large
> global structure."