reetesh nigam wrote:
> Hi All,
>
> I am retrieving data from a Sybase database using the Sybase module for
> Python. The query itself is not taking time; it is fetchall() that is
> taking a long time.
>
> Below is the test script:
>
> def run_query(db, query):  # Run query and return the result rows
>     t1 = datetime.now()
>     cursorObj = db.cursor()
>     t2 = datetime.now()
>     cursorObj.execute(query)
>     t3 = datetime.now()
>     import pdb
>     pdb.set_trace()
>     rowset = cursorObj.fetchall()
>     t4 = datetime.now()
>     cursorObj.close()
>     print "Time taken to make cursor --%s" % (t2 - t1)
>     print "Time taken to execute query --%s" % (t3 - t2)
>     print "Time taken for fetchall --%s" % (t4 - t3)
>     return rowset
>
> Output:
> Time taken to make cursor --0:00:00.000037
> Time taken to execute query --0:00:00.379443
> Time taken for fetchall --0:00:14.739064
fetchall() probably has to transfer a lot of rows. If you want to process
them one at a time you can turn run_query into a generator:

def run_query(db, query):
    cursor = db.cursor()
    cursor.execute(query)
    while True:
        row = cursor.fetchone()
        if row is None:
            break
        yield row

A complete implementation that guarantees that the cursor is closed might
look like this (untested):

import contextlib

@contextlib.contextmanager
def run_query(db, query):
    cursor = db.cursor()
    try:
        cursor.execute(query)
        yield iter(cursor.fetchone, None)
    finally:
        cursor.close()

# use it
with run_query(db, query) as rows:
    for row in rows:
        print row

This is likely to *increase* the overall time taken, but should drastically
reduce the time you have to wait for the first record to be printed,
i.e. the latency.

-- 
https://mail.python.org/mailman/listinfo/python-list
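If one network round trip per row turns out to be too slow, fetchmany() offers a middle ground between fetchall() (one big transfer) and fetchone() (one round trip per row): rows arrive in bounded batches, so memory stays flat while round trips are amortized. A sketch of that pattern, using sqlite3 from the standard library as a stand-in for the Sybase module (both follow the DB-API, so the cursor calls carry over); the names iter_rows and batch_size are illustrative, not part of either library:

```python
import sqlite3

def iter_rows(db, query, batch_size=1000):
    # Yield rows one at a time, but fetch them from the server in
    # batches of batch_size via fetchmany() -- bounded memory use,
    # far fewer round trips than fetchone().
    cursor = db.cursor()
    try:
        cursor.execute(query)
        while True:
            batch = cursor.fetchmany(batch_size)
            if not batch:        # empty list means the result set is exhausted
                break
            for row in batch:
                yield row
    finally:
        cursor.close()

# Demo against an in-memory SQLite database.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE t (n INTEGER)")
db.executemany("INSERT INTO t VALUES (?)", [(i,) for i in range(10)])
rows = list(iter_rows(db, "SELECT n FROM t ORDER BY n", batch_size=3))
```

Tuning batch_size trades memory for round trips; a few hundred to a few thousand rows per batch is a common starting point.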