A big problem is that the request from your client starts a thread which is supposed to be very short-lived (this being a WSGI web server). That thread is the parent of any children it spawns (processes or other threads), and when the parent dies, its child processes and threads die with it. This is actually how Python's multiprocessing is designed to work by default: a program that spawns threads or processes is supposed to act as the co-ordinator, and children left without a parent are cleaned up. So you either have to block the request from returning until you've got the 20 images, or, if you let the request return the way it is supposed to, your child processes or threads will be terminated. You've found a workaround by implementing a primitive version of the scheduler, but it's very forced, which is why it is never the recommended solution for web2py.
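To make the parent/child lifecycle concrete, here is a minimal sketch, assuming a standalone script outside web2py; the file name `child_output.txt`, the `spawn_daemon_child` helper, and the 2-second sleep are all illustrative, not from the original discussion:

```python
import multiprocessing
import time


def worker(path):
    time.sleep(2)  # simulate slow work, e.g. fetching one image
    with open(path, "w") as f:
        f.write("done")


def spawn_daemon_child(path):
    """Start a worker the way a request handler might: as a daemon child."""
    child = multiprocessing.Process(target=worker, args=(path,))
    child.daemon = True  # daemon children are terminated when the parent exits
    child.start()
    return child


if __name__ == "__main__":
    child = spawn_daemon_child("child_output.txt")
    # The parent now returns immediately, just as a WSGI request handler
    # does; because the child is a daemon, it is killed when the parent
    # exits, before it ever writes its output file.
```

Run as a script, the parent exits straight away and `child_output.txt` is never written, which is the same fate as workers spawned inside a short-lived request thread.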
You say you need to serve the images to the client, which I take to mean a browser. I've had a similar problem with embedded YouTube videos, where the served image is a thumbnail. I used ajax: I put each image location into its own LOAD component, so that the client (browser) sends n parallel requests to the server. Effectively the image requests now run in parallel, with the browser playing the role of the co-ordinating entity. One way or another the client has to wait until you have all 20 images, so on the face of it this doesn't cost anything, and the user gets to see interim results. It sounds like you can't predict the required images in advance; if you could, then the scheduler would work.

On Tuesday, November 4, 2014 5:03:23 AM UTC+11, Josh L wrote:
> Unfortunately after lots of experimentation I wasn't able to get the
> multiprocessing module to work with web2py. What I did find however was
> that I could use subprocess.check_output to launch a Python script
> containing the multiprocessing module and a pool of workers, and I can get
> data to the script by passing it command line arguments. The check_output
> function returns script results "print"ed to stdout which I could then
> parse back in the web2py process and use to update my database. So it looks
> like multiprocessing can work as long as you use it in an external script
> running in its own instance of python. Hope this helps someone!

--
Resources:
- http://web2py.com
- http://web2py.com/book (Documentation)
- http://github.com/web2py/web2py (Source code)
- https://code.google.com/p/web2py/issues/list (Report Issues)
---
You received this message because you are subscribed to the Google Groups "web2py-users" group.
To unsubscribe from this group and stop receiving emails from it, send an email to web2py+unsubscr...@googlegroups.com.
For more options, visit https://groups.google.com/d/optout.
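A minimal sketch of the LOAD-based approach described above, assuming a web2py controller `default.py`; the function names `show_images`/`show_image` and the image paths are illustrative placeholders, not from the original post:

```python
# controllers/default.py -- hypothetical names throughout

def show_images():
    # Main page: emit one LOAD component per image. The browser then
    # issues the 20 ajax requests in parallel, playing co-ordinator.
    return dict(components=[LOAD('default', 'show_image', args=[i], ajax=True)
                            for i in range(20)])

def show_image():
    # Each component request fetches or builds just one image, so slow
    # images don't block the rest of the page.
    i = int(request.args(0))
    return IMG(_src=URL('static', 'images/img%s.png' % i))
```

This is framework code and only runs inside web2py, but the shape of it is the point: the per-image work moves into its own request, so no single request thread has to keep child workers alive.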
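For the subprocess.check_output technique Josh describes, here is a self-contained sketch: the worker script (written to a temp file here purely so the example runs on its own), the squaring task, and the helper name `run_in_external_python` are all illustrative stand-ins for the real image-fetching work:

```python
import os
import subprocess
import sys
import tempfile

# The external script: multiprocessing runs here, in its own Python
# instance, taking inputs from argv and printing results to stdout.
WORKER_SCRIPT = '''\
import sys
from multiprocessing import Pool

def work(n):
    return n * n  # stand-in for fetching/building one image

if __name__ == "__main__":
    args = [int(a) for a in sys.argv[1:]]
    with Pool(4) as pool:
        for result in pool.map(work, args):
            print(result)  # results go back to the caller via stdout
'''


def run_in_external_python(values):
    """Launch the worker in its own Python process and parse its stdout."""
    fd, path = tempfile.mkstemp(suffix=".py")
    try:
        with os.fdopen(fd, "w") as f:
            f.write(WORKER_SCRIPT)
        # check_output blocks until the script exits and returns stdout,
        # which the web2py process can then parse and store in the db.
        out = subprocess.check_output(
            [sys.executable, path] + [str(v) for v in values])
        return [int(line) for line in out.decode().split()]
    finally:
        os.remove(path)
```

In a real app the worker would be a fixed script on disk rather than a temp file; the key point is that the pool lives in a separate interpreter, so web2py's request lifecycle never owns the worker processes.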