Hi,

I have a function that checks an uploaded file. I want the check to be 
scheduled automatically on upload, but I also want an admin user to be able 
to run the check live from the website (it doesn't take too long). I 
currently have a single shared function in a model, but it then has to cope 
with the differences between the runtime environments of a controller and a 
scheduler worker.

One example is that a worker does not have access to the host name from the 
request environment, so `URL(..., host=True)` yields localhost 
(127.0.0.1). That's easy to solve by loading the host name via AppConfig(), 
which is available in both environments.
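To illustrate the idea outside of web2py, here is a minimal, self-contained sketch using Python's configparser (which is what web2py's AppConfig uses under the hood to read private/appconfig.ini). The [host] section and "name" key are assumptions, not a standard web2py layout:

```python
# Minimal analogue of reading the host from a config file shared by
# the web and worker processes, instead of the request environment.
# The [host]/name key is a hypothetical example, not a web2py default.
import configparser

APPCONFIG_INI = """
[host]
name = www.example.com
"""

conf = configparser.ConfigParser()
conf.read_string(APPCONFIG_INI)

def absolute_url(path):
    # Build the absolute URL from the configured host, so a scheduler
    # worker (with no real request) gets the same result as a controller.
    return "https://%s%s" % (conf["host"]["name"], path)

print(absolute_url("/app/default/check"))
# https://www.example.com/app/default/check
```

In the real app the same value would be fed to `URL(...)` via its `host` argument rather than concatenated by hand.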

The other one is that a worker needs to run db.commit() to persist any 
updates or inserts made by the DAL in the function. Easy enough to stick 
`db.commit()` in before returning from the function, but the controller 
will then also commit when it is run from the website. Is this a problem, 
either for overhead or for the database? Are there any other issues which 
mean I should keep the scheduler version and controller version separate?
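For reference, the pattern I mean is roughly the following sketch: one shared function with an explicit flag, so the controller can rely on web2py's automatic commit at the end of the request while the worker commits itself. The FakeDAL class is just a stand-in here so the example is self-contained; the flag name `do_commit` is my own invention:

```python
# One shared check function; commit only when the caller asks for it.
# FakeDAL stands in for the real DAL purely to make this runnable.

class FakeDAL:
    def __init__(self):
        self.commits = 0
    def commit(self):
        self.commits += 1

db = FakeDAL()

def check_upload(upload_id, do_commit=False):
    # ... validate the file, insert/update result rows via db ...
    result = "ok"          # placeholder outcome
    if do_commit:
        db.commit()        # needed only when run as a scheduler task
    return result

check_upload(1)                   # controller path: no explicit commit
check_upload(2, do_commit=True)   # worker path: commits before returning
print(db.commits)
# 1
```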

Thanks,
David
