disk cache is shared among processes, so no problem there. uwsgi should use os.fork() to spawn a new process, so modules are definitely cached.
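To make that concrete, here is a minimal sketch of cache.disk in a controller (cached_list and db.mytable are made-up names, and the usual scaffolding model defining db and cache is assumed); since the entries live on the filesystem, every forked uwsgi worker reads the same value:

    def cached_list():
        # cache.disk pickles the result to the filesystem, so the entry is
        # shared by all uwsgi workers forked from the same master process
        rows = cache.disk('mytable_all',
                          lambda: db(db.mytable).select().as_list(),
                          time_expire=300)
        return dict(rows=rows)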
Didn't test with cache.ram (but if you want I may test it) because I use redis as cache in production (a rough sketch of that setup is appended below the quoted thread). cache.ram will be "erased" when a "forced" shutdown of the process is required (this conf by default asks uwsgi to respawn a child process that has fulfilled 2000 requests, the max-requests parameter). redis and memcache obviously "solve the theoretical issue" of course (they are totally independent processes from web2py).

On Friday, January 25, 2013 10:17:33 PM UTC+1, Arnon Marcus wrote:
>
> Well, the way I currently understand this is as follows:
> Web2py uses execfile for most of its work (models, controllers and
> views), so no reload() is needed in production whenever a file gets
> modified.
> The flip-side of this is that there can be no cache in these module
> files.
> Then the ram/disk cache is for getting from these executed modules to the
> python process running web2py. But this is still assuming that there is a
> single process. If nginx/uwsgi launches multiple processes of web2py,
> then this cache will no longer be helpful - each time a process is
> launched, the cache would have to be re-populated - this could mostly
> mean doing database queries. That's a very bad thing for performance.
> I don't know about web2py's implementation of memcached or about memcache
> at all, but I guess it's meant for solving that issue, right?
>
>
> On Fri, Jan 25, 2013 at 1:00 PM, Paolo valleri <paolo....@gmail.com> wrote:
>
>> I am using memcached for caching cache.ram and cache.disk, the
>> configuration is really easy, this is mine:
>>
>> if not request.is_local:
>>     from gluon.contrib.memcache import MemcacheClient
>>     memcache_servers = ['127.0.0.1:11211']
>>     cache.memcache = MemcacheClient(request, memcache_servers)
>>     cache.ram = cache.disk = cache.memcache
>>
>> but I don't know how to measure the gain of using it. Any idea?
>>
>> paolo
>>
>>
>> On Friday, January 25, 2013 9:51:22 PM UTC+1, Arnon Marcus wrote:
>>
>>> Oh, and what about memcache?
>>> Can web2py benefit from it? Is there somewhere an explanation about this?
>>>
>>>
>>> On Fri, Jan 25, 2013 at 12:48 PM, Arnon Marcus <a.m.m...@gmail.com> wrote:
>>>
>>>> 10x for clearing things up - you're right, I didn't do too much
>>>> research on uwsgi, and just assumed that it is for nginx what mod_wsgi
>>>> is for apache.
>>>> So I guess I had it wrong.
>>>> My current (soon to be "old") setup is running apache + mod_wsgi on
>>>> windows 7, so I know all about the headaches that come from setting this
>>>> up...
>>>> I would be more than glad to put apache behind me for good, even if it
>>>> would offer no performance improvements to this script's setup the way
>>>> it does for php...
>>>>
>>>> On that note, how exactly is uwsgi handling web2py processes, as would
>>>> be configured in this script? Is it easily customizable after the fact?
>>>> Are there any pros/cons for different scenarios that one should be
>>>> aware of?
>>>>
>>>>
>>>>
>>>> On Fri, Jan 25, 2013 at 12:21 PM, Niphlod <nip...@gmail.com> wrote:
>>>>
>>>>> seems you missed a point... uwsgi here is not a module, it is an
>>>>> executable that does one job and it does it well (actually, very well,
>>>>> and there's a lot of it that can be used outside the scope of this
>>>>> script).
>>>>> It could be used as a standalone high-performance webserver, but nginx
>>>>> is placed in front of it to serve static files and to take care of
>>>>> DDoS attacks.
>>>>>
>>>>> If you want to use apache behind nginx instead of uwsgi behind nginx,
>>>>> you're basically going to suffer wasted cpu, wasted ram, and a much
>>>>> harder to maintain config.
>>>>> If you want to run python on apache because it's your default
>>>>> webserver, then mod_wsgi is the way to go. Having to install apache
>>>>> just to run python is only a waste of resources.
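For reference, the redis setup mentioned at the top looks roughly like this in a model file, mirroring Paolo's memcache snippet. It assumes gluon.contrib.redis_cache is available and a redis server on localhost:6379; the RedisCache constructor has changed between web2py releases, so check it against your version:

    if not request.is_local:
        from gluon.contrib.redis_cache import RedisCache
        # every worker talks to the same redis instance, so cached entries
        # survive uwsgi respawning a worker after max-requests is reached
        cache.redis = RedisCache('localhost:6379', db=None, debug=False)
        cache.ram = cache.disk = cache.redis

With that in place, anything cached through cache.ram or cache.disk ends up in redis, so a freshly forked or respawned process does not have to re-run the cached queries.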