Did you remember to do a commit at each round?

A long-running script / daemon can do:

db = DAL(...)
<define tables>

while True:
    <do stuff with DAL>
    db.commit()   # releases the connection back to the pool
    sleep(1000)   # just for example, i.e. don't do anything for a long time

The DAL pushes the connection "back to sleep in the pool" [ :-) ] when
there is a commit.
When you start operating on the DAL instance (db) again, a connection is
taken from the pool and *tested* to make sure it still works.
If the connection is dead, it is replaced by a new one, and a transaction
is then started on that connection.
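
Put concretely, a minimal self-contained sketch might look like this
(the connection string, table and field names are placeholders; the
explicit import is for standalone use, since under "-S appname -M" the
model file already gives you db):

import time
from gluon.dal import DAL, Field

db = DAL('mysql://user:password@localhost/mydb', pool_size=5)
db.define_table('mytable', Field('value'))

while True:
    # using db takes a connection from the pool; a dead connection is
    # detected and replaced before the transaction starts
    db.mytable.insert(value='tick')
    db.commit()      # back to the pool
    time.sleep(300)  # long idle period; MySQL may drop the idle
                     # connection, the next round gets a tested one

Without the commit, the same (possibly dead) connection and its open
transaction would be held across the whole sleep.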

 mic


2013/9/24 Ricardo Cárdenas <ricardo.carde...@gmail.com>

> Derek, thanks for your suggestion. Which is the preferred way to reopen
> the connection and define the tables?
>
> My db.py contains
>
> db = DAL(connection_string, options...)
> db.define_table('table1', Field('f1' ...), Field('f2'...))
> db.define_table(...)
> db.define_table(...)
> ...
>
> So ideally I wouldn't have to repeat this code in my script. Is it best to
> break out the DAL/define_table calls into a file that I import in both my
> db.py and my script.py?
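
One way to avoid repeating the definitions (the file name tables.py and
the helper define_tables below are only illustrative) is a small module
under modules/ that both the model file and the script can import:

# modules/tables.py
from gluon.dal import Field

def define_tables(db):
    db.define_table('table1', Field('f1'), Field('f2'))
    # ... the rest of the tables
    return db

# in models/db.py, and again in the script if it builds its own db
from tables import define_tables
db = DAL(connection_string, pool_size=5)
define_tables(db)

When the script is run with "-M" the model file is executed anyway, so
the script itself only needs this if it creates a separate DAL instance.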
>
>
>
> On Tuesday, September 24, 2013 4:45:09 PM UTC-5, Derek wrote:
>>
>> Each time you need to do work, you should open a new connection. You'd
>> think there is a large overhead in creating a connection, but there isn't.
>>
>> On Tuesday, September 24, 2013 2:33:24 PM UTC-7, Ricardo Cárdenas wrote:
>>>
>>> I have a web2py app running fine on pythonanywhere. I have a minor
>>> problem: I think I understand why it is happening, but I would like your
>>> advice on how best to fix it.
>>>
>>> The app itself works fine. But I also run a scheduled task using PA's
>>> scheduler, by executing "python web2py.py -S appname -M -R
>>> appname/private/myscript.py". The script does some processing for a few
>>> minutes, and only then starts writing to the MySQL database.
>>>
>>> PA's MySQL database has wait_timeout set to 120 seconds. If my initial
>>> processing takes less than wait_timeout, everything works fine. But when
>>> the initial processing exceeds wait_timeout, I get a 'Lost Connection to
>>> MySQL' error when my code tries to write to the database. I am using
>>> connection pooling in the call to the DAL, but I guess the connection
>>> instantiated by the db.py file is neither automatically kept warm nor
>>> automatically replaced by another good connection when it expires.
>>>
>>> What's the best practice here:
>>>
>>>    - Is there a preferred way to ping the database every once in a
>>>    while?
>>>    - Is there a preferred way to detect an expired DAL connection, and
>>>    to request another one?
>>>
>>> Thanks for any suggestions or pointers. Sorry if this is in the docs; I
>>> couldn't find it. Warm regards, Ricardo
>>>
