I ran into this a few nights ago, and if I weren't about to leave on
vacation I'd try some prototype code for it.  From what I can tell, at
the end of any request all open database connections are marked to be
closed (except for those needed to keep the connection pool up to its
proper size).  What I'd propose as an extension is a dontclose flag
you can pass to the constructor of the DAL that, if set, bypasses this
cleanup and keeps the connection alive.  The downside is that, for my
intended use, the user of that DAL would also need to manage the
lifetime of the transaction.  However, since the code lives in a
module that may outlive a single request, that shouldn't be considered
a bad thing.
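
Roughly the usage I have in mind, borrowing the names from the
traceback below purely for illustration.  None of this exists today:
dontclose is exactly the extension being proposed, and I haven't
checked whether the DAL is importable this way from the current
gluon/sql.py, so treat this as a sketch only:

    # Sketch only -- dontclose is the proposed flag, not current web2py.
    from gluon.sql import DAL   # assuming gluon.sql exposes the DAL here

    class Incidencias(object):       # module-level object, cached
        def __init__(self, uri):     # across requests
            # Proposed: keep this connection out of the end-of-request
            # cleanup done by close_all_instances.
            self._db = DAL(uri, dontclose=True)

        def migrar_archivo(self, qry_localidad):
            self._db(qry_localidad).delete()
            # With dontclose set the framework no longer commits for
            # us, so this object owns the transaction lifetime.
            self._db.commit()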

Are there any thoughts on adding an optional argument so that
close_all_instances skips connection.close() when some property is
set?  (And yes, I know about custom_commit / custom_rollback.)
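
Something like the following is what I'm picturing inside sql.py.  I
haven't read close_all_instances closely while travelling, so the
instance registry, the attribute names and the pooling /
custom_commit handling are all stand-ins, not the real code:

    # Rough sketch, not actual gluon/sql.py: _instances stands in for
    # however sql.py tracks the per-thread DAL instances.
    _instances = []

    def close_all_instances(action='commit'):
        # End-of-request cleanup: commit/rollback every open instance,
        # then close it unless the proposed flag was set on it.
        still_open = []
        for db in _instances:
            getattr(db, action)()              # commit or rollback, as now
            if getattr(db, '_dontclose', False):
                still_open.append(db)          # proposed: leave it open
            else:
                db._connection.close()         # existing behaviour
        _instances[:] = still_open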

On Wed, Jun 9, 2010 at 1:38 PM, Jose <jjac...@gmail.com> wrote:
>
> As I mentioned, caching the object works well.
> When I create the object I pass the db as an argument.  The problem
> I have is that when I try to save something to the database it
> fails, because the database is closed.
>
> ...
> File "/usr/home/jose/web2py/applications/py_ccu/modules/
> incidencias.py", line 124, in migrar_archivo
>    self._db(qry_localidad).delete()
>  File "/usr/home/jose/web2py/gluon/sql.py", line 3263, in delete
>    self._db._execute(query)
>  File "/usr/home/jose/web2py/gluon/sql.py", line 899, in <lambda>
>    self._execute = lambda *a, **b: self._cursor.execute(*a, **b)
> ProgrammingError: Cannot operate on a closed database.
>
> I don't know why it gets closed, since in __init__ I run several
> queries and have no problems.
>
> It reminds me of the error I get when I use the web shell.  Could it
> be the same thing?
>
> Jose
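
For what it's worth, that ProgrammingError comes straight from the
sqlite3 driver: once the connection a cursor belongs to has been
closed (which is what close_all_instances does at the end of the
request that created your object), every later execute fails the same
way.  That's also why the queries in __init__ work: they run during
the request that opened the connection.  A minimal reproduction
outside web2py:

    # Nothing web2py-specific here: sqlite3 itself raises this error
    # when a cursor is used after its connection has been closed.
    import sqlite3

    conn = sqlite3.connect(':memory:')
    cur = conn.cursor()
    cur.execute('CREATE TABLE t (x INTEGER)')   # fine while the
    conn.close()                                # connection is open
    try:
        cur.execute('DELETE FROM t')            # same pattern as
    except sqlite3.ProgrammingError as e:       # self._db(qry).delete()
        print(e)   # -> Cannot operate on a closed database.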
