I'm somewhat baffled by an issue that came up while testing the scheduler. 
I'm on Windows for this test, but someone reported the same problem on 
unix/mac too.
Maybe someone more experienced than me can explain this.
From my understanding, Python's DBAPI allows me to:
- have a "consumer" process reading the data in some table
- have another "producer" process inserting data and then committing it
- the next time the "consumer" reads the table, the data inserted by the 
"producer" is visible (and fetchable)
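A minimal sketch of that expectation using the stdlib sqlite3 driver (the table name and file path here are just placeholders for illustration):

```python
import os
import sqlite3
import tempfile

# Two separate connections stand in for the two processes.
path = os.path.join(tempfile.mkdtemp(), "demo.db")
producer = sqlite3.connect(path)
consumer = sqlite3.connect(path)

producer.execute(
    "CREATE TABLE testingtable (id INTEGER PRIMARY KEY, testcol TEXT)")
producer.commit()

# "Producer" inserts a row and commits it.
producer.execute("INSERT INTO testingtable (testcol) VALUES ('a')")
producer.commit()

# "Consumer" reads next: the committed row is expected to be visible.
count = consumer.execute("SELECT COUNT(*) FROM testingtable").fetchone()[0]
print(count)  # 1
```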

This works with SQLite, PostgreSQL and MSSQL, but not with MySQL (at least 
on my machine). I don't know what's going on.

Steps to reproduce:
consumer.py
from gluon import DAL, Field
import time
if __name__ == '__main__':
    db = DAL('mysql://....')
    db.define_table('testingtable',
        Field('testcol'))
    for a in range(1000):
        print a, db(db.testingtable.id>0).count()
        #db.commit()  # uncommenting this line makes the problem go away
        time.sleep(2)

producer.py

from gluon import DAL, Field
import time
if __name__ == '__main__':
    db = DAL('mysql://....')
    db.define_table('testingtable',
        Field('testcol'))
    for a in range(1000):
        print a, db.testingtable.insert(testcol=a)
        db.commit()
        time.sleep(2)

Starting both scripts and watching the output, the consumer never sees the 
rows inserted (and committed) by the producer.

Everything works as intended if the db.commit() line in consumer.py is 
uncommented.

Now, have I completely misunderstood the DBAPI, or does the MySQL driver 
work differently? (Some kind of transaction-isolation issue, maybe?)
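If it is isolation-related, one way to check (a sketch only, assuming a live MySQL connection through the DAL; not verified here) is to ask the server for its session isolation level via raw SQL:

```python
# Sketch: db.executesql is web2py's DAL escape hatch for raw SQL.
# '@@tx_isolation' is the variable name on MySQL before 5.7.20; newer
# servers call it '@@transaction_isolation'. InnoDB's default is
# REPEATABLE-READ, which gives each transaction a consistent snapshot.
print(db.executesql("SELECT @@tx_isolation"))
```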
