Provided web2py does not leak memory (it occasionally did in development versions), the memory overhead of processes is not much higher than that of threads. The additional physical memory that each new web2py process requires, beyond the parent process, is only a fraction of the virtual memory each process sees. This is because each process shares the memory pages it does not modify (copy on write). The framework code is largely loaded before the child processes are spawned, so that part is mostly shared. Web2py's WSGI application could probably be improved to share even more data among processes, but it already does a good job.
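As a rough illustration of that copy-on-write effect, here is a minimal sketch (not web2py code; it assumes a Unix system, since it uses os.fork, and the list and worker count are made up) of the load-then-fork pattern that a preforking server such as uwsgi follows:

    import os, time

    # Load something big once in the parent, the way the framework code is
    # loaded before the worker processes are forked.
    shared_table = list(range(10 ** 6))

    children = []
    for _ in range(4):                      # stand-ins for the worker processes
        pid = os.fork()
        if pid == 0:
            # The child starts out sharing the parent's pages and only gets
            # private copies of the pages it writes to (copy on write), so
            # the big list above costs it almost no extra physical memory.
            time.sleep(1)
            os._exit(0)
        children.append(pid)

    for pid in children:
        os.waitpid(pid, 0)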
Performance tip: it is important to keep an eye on memory usage so you can set a sensible limit on the number of web2py processes: for good responsiveness it is better to run fewer processes than to have their memory pages swapped out. It is also important to check that the applications are compiled and that they do not use memory carelessly, for example by retaining large datasets in the session or the cache; with the DAL, use limitby whenever possible (a short limitby sketch follows the quoted thread below).

mic

2012/6/18 Massimo Di Pierro <massimo.dipie...@gmail.com>:
> You need to have some kind of concurrency because if one user is doing,
> for example, a download, you do not want other users to wait.
>
> The built-in web server uses threads for concurrency, but this reduces
> efficiency. Some servers use lightweight threads with async sockets. They
> are the most efficient, but they only use one interpreter and therefore
> one core at a time (unless you explicitly use processes). Using processes
> is the best way to take advantage of multiple cores. Which combination you
> use really depends on the architecture.
>
> In languages which are not interpreted things are easier, because you just
> use threads and you get scalability and concurrency at the same time. Not
> in Ruby or Python, unfortunately.
>
> Massimo
>
> On Monday, 18 June 2012 16:17:39 UTC-5, mrtn wrote:
>>
>> Thanks Massimo. So what are the pros of having multiple threads in the
>> context of web2py, if any? Asynchronous processing?
>>
>> On Monday, 18 June 2012 15:19:03 UTC-4, Massimo Di Pierro wrote:
>>>
>>> There are pros and cons. If you use threads in a Python program, the
>>> more computing cores you have, the slower - not faster - the program
>>> gets. This is a Python feature: even if you have threads, there is only
>>> one interpreter, and therefore execution is serialized anyway. For
>>> scalability you should have processes (not threads), one per core.
>>>
>>> On Monday, 18 June 2012 11:34:39 UTC-5, mrtn wrote:
>>>>
>>>> I'm deploying web2py with uwsgi 1.2.3, and upon starting the app there
>>>> is a message in the uwsgi log that says:
>>>>
>>>> *** Python threads support is disabled. You can enable it with
>>>> --enable-threads ***
>>>>
>>>> The uwsgi documentation says the following about the enable-threads
>>>> option:
>>>>
>>>> Enable threads in the embedded languages. This will allow to spawn
>>>> threads in your app.
>>>>
>>>> I wonder if I really need this enabled for web2py. If so, what would be
>>>> a common use case for it? Thanks.
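Here is the limitby sketch promised in the performance tip above. It is a hypothetical web2py controller action: the post table, its fields, and the page size are assumptions, and db is expected to be defined in a model file as usual. The point is that only one page of rows is fetched and kept in memory per request instead of the whole result set:

    # assumed model: db.define_table('post', Field('title'), Field('created_on', 'datetime'))
    def index():
        rows = db(db.post.id > 0).select(
            orderby=~db.post.created_on,
            limitby=(0, 20),   # fetch only rows 0-19; keeps memory per request bounded
        )
        return dict(posts=rows)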
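Tying the thread back to deployment, the following is one plausible uwsgi configuration along the lines discussed above (worker processes rather than threads, roughly one per core), not a recommended or official setup: the socket path, install directory, and worker count are assumptions to adjust for your machine.

    [uwsgi]
    # hypothetical socket and install paths; adjust to your setup
    socket = /tmp/web2py.sock
    chdir = /home/www-data/web2py
    # web2py's stock WSGI entry point
    wsgi-file = wsgihandler.py
    master = true
    # roughly one worker process per core, as suggested in the thread
    processes = 4
    # only needed if the application itself spawns Python threads
    enable-threads = true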