On Dec 2, 2008, at 11:21 PM, [EMAIL PROTECTED] wrote:

> Is there a cross-platform way to launch multiple Python processes
> and monitor CPU usage

os.getloadavg() might be useful. It certainly works on *nix; I don't know about Windows. The documentation doesn't mention any platform limitations.
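
Something along these lines might work as a quick capacity check. The hasattr() guard is there in case getloadavg() turns out not to exist on your Windows build, and the 0.8-per-core threshold is just a number I made up:

    import os
    import multiprocessing

    def has_spare_cpu(per_core_threshold=0.8):
        # Use the 1-minute load average as a rough proxy for CPU pressure.
        if not hasattr(os, "getloadavg"):
            return True  # no load average on this platform; assume there is capacity
        one_minute, _, _ = os.getloadavg()
        return one_minute < per_core_threshold * multiprocessing.cpu_count()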


HTH
Philip



> and disk i/o so that the maximum number of independent processes
> can be running at all times without overloading their environment?
> By process I mean independent application sessions vs. multiple
> threads of a single application. I'm looking for suggestions on
> what Python modules and/or techniques to use to maximize the number
> of processes I can run on my system over a multi-day period.
>
> Background: I have a large collection of data analysis scripts that
> can run independently of one another. These scripts read very large
> log files in a sequential manner and calculate specific statistics.
> The scripts are hosted on an 8-core, 64-bit server with 48 GB of
> memory that is dedicated to running them. I am looking for a way to
> write a master script that monitors system CPU and disk i/o and,
> when there is capacity, launches another script from a pool of
> scripts so that I get maximum utilization from my hardware, i.e. so
> that I'm running as many scripts as I can at any one point in time
> without overwhelming the system.
>
> The server in question is currently running Windows 2008
> Enterprise, but I have the option to switch to a 64-bit version of
> Linux if there are benefits to doing so.
>
> Thank you,
> Malcolmm
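
For the master-script part of the question, something like the sketch below might be a starting point: keep launching scripts from a pool while the load average leaves headroom, at most one new process per polling interval, and never more than a hard cap. The script names, the cap, the threshold and the sleep interval are all placeholders, and disk i/o throttling isn't shown (that part would need OS-specific performance counters, especially on Windows).

    import os
    import subprocess
    import time
    import multiprocessing

    SCRIPT_POOL = ["stats_a.py", "stats_b.py", "stats_c.py"]  # placeholder names
    MAX_PROCS = 2 * multiprocessing.cpu_count()                # hard upper bound
    LOAD_LIMIT = 0.8 * multiprocessing.cpu_count()             # arbitrary headroom

    def load_is_low():
        # os.getloadavg() may be missing on Windows; treat that as "go ahead".
        if not hasattr(os, "getloadavg"):
            return True
        return os.getloadavg()[0] < LOAD_LIMIT

    def main():
        pending = list(SCRIPT_POOL)
        running = []
        while pending or running:
            # Drop workers that have finished.
            running = [p for p in running if p.poll() is None]
            # Load average reacts slowly, so start at most one new script per pass.
            if pending and len(running) < MAX_PROCS and load_is_low():
                running.append(subprocess.Popen(["python", pending.pop(0)]))
            time.sleep(5)

    if __name__ == "__main__":
        main()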
