In article <[EMAIL PROTECTED]>, Pradip <[EMAIL PROTECTED]> wrote:
> Hello everybody. I am new to this forum and also to Python. I have read
> many things about multithreading in Python, but I am still having a
> problem.
>
> I am using the Django framework with Python, with PostgreSQL as the
> backend database, on Linux. My applications are long-running, and I am
> using threading. The problem I am facing is that the connections created
> for database (Postgres) updates are not getting closed, even though my
> threads have returned and updated the database successfully. It is not
> that the connections are never reused -- they are reused, but after some
> time a new one is created. In this way too many connections are created,
> eventually exceeding the max_connections limit in the Postgres
> configuration.
>
> ** I am using psycopg2 as the adapter for the Python-to-Postgres
> connection, which itself handles the connections (open/close).

Hi Pradip,

A common problem that new users of Python encounter is that they expect
database statements to COMMIT automatically. Psycopg2 follows the Python
DB-API specification and does not autocommit transactions unless you ask
it to do so. Perhaps your connections are not closing because they have
open transactions?

To enable autocommit, call this on your connection object:

    connection.set_isolation_level(psycopg2.extensions.ISOLATION_LEVEL_AUTOCOMMIT)

> Now the problem is with Django / Python / psycopg2 or anything else??

Are you asking if there are bugs in this code that are responsible for
your persistent connections? If so, then I'd say the answer is almost
certainly no. Of course it's possible, but Django/psycopg2/Postgres is a
pretty popular stack. The odds that there's a major bug in this popular
code, examined by many eyes, versus a bug in your code are pretty low, I
think. Don't take it personally -- the same applies to me and my code. =)

Happy debugging

--
Philip
http://NikitaTheSpider.com/
Whole-site HTML validation, link checking and more
--
http://mail.python.org/mailman/listinfo/python-list