On Fri, 2 Nov 2007 7:47 am, Gary Sewell wrote:
> Firstly, we are running a mod_perl application on 4 separate servers
> due to its bulkiness.
How many requests per second are you processing on each machine? Have
you looked at profiling your application?
> We have found that each Apache instance is almost double the size on
> the 64-bit servers:
> Example 64-bit:
> USER       PID %CPU %MEM    VSZ    RSS TTY STAT START TIME COMMAND
> www-data  9950  2.4  1.7 210324 141744 ?   S    14:26 0:11 \_ /usr/sbin/apache-perl
> Example 32-bit:
> USER       PID %CPU %MEM    VSZ    RSS TTY STAT START TIME COMMAND
> www-data  2336  0.0  2.7 117100 105908 ?   S    15:14 0:16 \_ /usr/sbin/apache-perl
> Is this something we just put up with, or have we done something
> drastically wrong?
IIRC, the calculations for shared memory on Linux aren't always
accurate, but Torsten may have released something that addresses this:
Linux::Smaps, I think. As a comparison, I have commonly seen apps that
take over 500 MB of total process memory, but most of that is shared
memory or data. I've seen Java and PHP apps take up comparatively as
much memory.
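As a rough sketch (assuming the Linux::Smaps module from CPAN is
installed), a small script like this reads /proc/<pid>/smaps and splits
a worker's memory into shared and private portions, which plain ps
output can't do reliably:

    use strict;
    use warnings;
    use Linux::Smaps;

    # Inspect one process: pass an Apache child PID on the command
    # line, or default to the current process.
    my $pid  = shift || $$;
    my $maps = Linux::Smaps->new($pid);

    printf "total size: %d kB\n", $maps->size;
    printf "rss:        %d kB\n", $maps->rss;
    printf "shared:     %d kB\n", $maps->shared_clean + $maps->shared_dirty;
    printf "private:    %d kB\n", $maps->private_clean + $maps->private_dirty;

The private figure is what each additional child really costs you; the
shared pages are counted against every child by ps and top, which is
part of why the RSS numbers look so alarming.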
> Is it worth installing a 32-bit distribution on the 64-bit processors?
> Would the 8 GB of RAM cause problems under a 32-bit distribution if we
> did this?
Probably not worth it. I run mod_perl on a 64-bit machine with 16 GB of
RAM and have no problems.
> How can I get a breakdown of what is taking up so much memory for
> each Apache instance? 100 MB is a lot, let alone 200 MB on the 64-bit
> servers. I'm sure we are going wrong somewhere.
Apache2::Status will give you a detailed breakdown of where your memory
is used.
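As a minimal sketch, assuming mod_perl 2: something roughly like this
in httpd.conf enables it (install B::TerseSize as well if you want
per-package size reports, and keep access restricted):

    # Load the status handler and turn on all of its options.
    PerlModule Apache2::Status
    PerlSetVar StatusOptionsAll On

    <Location /perl-status>
        SetHandler perl-script
        PerlResponseHandler Apache2::Status
        # Local requests only (Apache 2.2-style access control).
        Order deny,allow
        Deny from all
        Allow from 127.0.0.1
    </Location>

Browsing /perl-status on one of the servers then shows loaded modules,
the symbol table, compiled registry scripts and, with B::TerseSize,
per-package memory usage for that child.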
> Many Thanks.
> GS