May I ask why you are storing the CSV file rows in the database, if what you want is to send them to the webservice for processing?
If there are only a dozen or so users at a time, this could easily fit in memory. I would store everything in cache.ram with the session_id as the key, and delete it after processing or after a certain time has elapsed. This should speed up your processing by several orders of magnitude.

That said, sending 20 rows at a time to the webservice, when the files have many thousands of rows, will always be extremely slow. At the very least, you pay for network latency, TCP connection establishment, HTTP connection establishment, etc. on every call. All of that adds up to a huge amount of time: for example, 10,000 rows at 20 per call means 500 requests, so even 100 ms of per-request overhead alone adds nearly a minute. I would say that this webservice is not adequate for your objectives if being slow is a problem; you would need a webservice that lets you send many more rows per call.

I am assuming that the webservice is not part of your application, and that your application does not sit between the ajax calls in the client and the webservice, so there is not much optimization for you to do there.
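A minimal sketch of the caching idea above, in plain Python (class and method names are hypothetical, for illustration only): keep each user's parsed CSV rows in an in-memory store keyed by session_id, expire entries after a fixed time, and delete them once processing is done. In web2py itself you would reach for the built-in cache.ram with a time_expire argument instead of rolling your own.

```python
import time

class SessionRowCache:
    """Hypothetical in-memory store: session_id -> CSV rows, with expiry."""

    def __init__(self, time_expire=3600):
        self.time_expire = time_expire   # seconds before an entry goes stale
        self._store = {}                 # session_id -> (stored_at, rows)

    def put(self, session_id, rows):
        # Overwrite any previous upload for this session
        self._store[session_id] = (time.time(), rows)

    def get(self, session_id):
        entry = self._store.get(session_id)
        if entry is None:
            return None
        stored_at, rows = entry
        if time.time() - stored_at > self.time_expire:
            # Entry outlived its time_expire window; drop it
            del self._store[session_id]
            return None
        return rows

    def delete(self, session_id):
        # Call this as soon as processing finishes
        self._store.pop(session_id, None)

# Usage sketch
cache = SessionRowCache(time_expire=600)
cache.put("abc123", [("col1", "col2"), ("val1", "val2")])
rows = cache.get("abc123")   # read the rows back while processing
cache.delete("abc123")       # free the memory when done
```

The point is that reads and deletes are plain dictionary operations, so per-row access cost is negligible compared with a database round trip per batch.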

