[EMAIL PROTECTED] wrote:
> I am using cx_Oracle and MySQLdb to pull a lot of data from some tables
> and I find that the cursor.execute method uses a lot of memory that
> never gets garbage collected. Using fetchmany instead of fetchall does
> not seem to make any difference, since it's the execute that uses
> memory. [...]
For MySQLdb, the SSCursor class ("Server Side Cursor"), rather than the
default cursor class, may do what you want: retrieve the result set
row-by-row on demand, rather than all at once at the time of .execute().
You'll need to remember to .fetch...() every row and call .close(),
however. See the docstrings for the CursorUseResultMixIn and
CursorStoreResultMixIn classes in MySQLdb.cursors for more information.

> [...] Breaking the query down to build lots of small tables doesn't
> help, since execute doesn't give its memory back, after reading enough
> small tables execute returns a memory error. What is the trick to get
> memory back from execute in cx_Oracle and MySQLdb?

Generally, the trick is to avoid consuming the memory in the first
place. :)

Regards,
Mike
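
P.S. For what it's worth, here's a rough, untested sketch of the
SSCursor approach described above. The connection parameters and the
"big_table" name are placeholders, not anything from your setup:

    import MySQLdb
    import MySQLdb.cursors

    # Opening the connection with cursorclass=SSCursor makes every
    # cursor server-side; you can also pass a cursorclass to
    # conn.cursor() for just one query.
    conn = MySQLdb.connect(host="localhost", user="me", passwd="secret",
                           db="mydb",
                           cursorclass=MySQLdb.cursors.SSCursor)
    cur = conn.cursor()
    try:
        cur.execute("SELECT * FROM big_table")  # rows stay on the server
        row = cur.fetchone()
        while row is not None:
            # ... process one row at a time; memory use stays flat ...
            row = cur.fetchone()
    finally:
        # With SSCursor you must fetch (or discard) every row and close
        # the cursor before issuing another query on this connection.
        cur.close()
        conn.close()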