I see. The DBAPI spec was created to smooth over the differences between 
implementations: that's why, for example, if a driver "defaults" to 
autocommit (like pyodbc), its DBAPI layer forces autocommit off.

I'm really not well-versed in "following the PEP", but I think pymysql's 
behaviour breaks the DBAPI spec. If it does not, we should add to the DAL 
the "read committed" instruction needed to make pymysql behave like every 
other adapter.
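
To be clear about what "instruction" means here, something along these 
lines right after connecting (the connection parameters are made up; the 
SET statement is standard MySQL syntax):

import pymysql

conn = pymysql.connect(host='localhost', user='web2py',
                       passwd='secret', db='test')
cur = conn.cursor()
# make this session see rows committed by other sessions immediately
cur.execute("SET SESSION TRANSACTION ISOLATION LEVEL READ COMMITTED")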

On Tuesday, August 7, 2012 9:35:15 AM UTC+2, Marin Pranjić wrote:
>
> The default isolation level for MySQL is 'Repeatable Read': a 
> transaction does not see commits made by other transactions. You can use 
> db.commit() to refresh the table snapshot, or change the isolation level 
> with db.executesql at the beginning of the transaction. The 'Read 
> Committed' level is what you need.
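>
> Untested, but something like this at the top of consumer.py should be 
> enough:
>
> db = DAL('mysql://....')
> db.executesql('SET SESSION TRANSACTION ISOLATION LEVEL READ COMMITTED;')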
>
> Postgres & MSSQL have 'Read Committed' by default.
> Not sure about SQLite, but if it works, it works :)
>
> Marin
>
> On Monday, August 6, 2012 10:08:37 PM UTC+2, Niphlod wrote:
>>
>> I'm somewhat baffled by an issue that came up when testing the scheduler. 
>> BTW, I'm on Windows for this test, but someone reported the same 
>> problem on unix/mac too.
>> Maybe someone more experienced than me can explain this.
>> From my understanding, the DBAPI for Python allows me to:
>> - have a "consumer" process reading the data in some table
>> - have another "producer" process inserting data and then committing it
>> - the next time the "consumer" reads the table, the data inserted by 
>> the "producer" is readable (and fetchable)
>>
>> This works with SQLite, PostgreSQL and MSSQL, but not with MySQL (at 
>> least on my machine). I don't know what's going on.
>>
>> Steps to reproduce:
>> consumer.py
>> from gluon import DAL, Field
>> import time
>>
>> if __name__ == '__main__':
>>     db = DAL('mysql://....')
>>     db.define_table('testingtable',
>>         Field('testcol'))
>>     for a in range(1000):
>>         # poll the table and print how many rows are currently visible
>>         print a, db(db.testingtable.id > 0).count()
>>         #db.commit()  # uncommenting this makes the producer's rows visible
>>         time.sleep(2)
>>
>> producer.py
>>
>> from gluon import DAL, Field
>> import time
>>
>> if __name__ == '__main__':
>>     db = DAL('mysql://....')
>>     db.define_table('testingtable',
>>         Field('testcol'))
>>     for a in range(1000):
>>         # insert one row and commit it right away
>>         print a, db.testingtable.insert(testcol=a)
>>         db.commit()
>>         time.sleep(2)
>>
>> Starting both scripts and watching the output, I end up with the 
>> consumer not seeing the rows inserted (and committed) by the producer.
>>
>> Everything works as intended if the db.commit() line is uncommented in 
>> the consumer.py script.
>>
>> Now, have I completely misunderstood the DBAPI spec, or does the MySQL 
>> driver work differently? (Some kind of transaction-isolation issue, 
>> maybe?)
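>>
>> A quick way to check what the session is actually using (the variable 
>> name is per the MySQL 5.x docs):
>>
>> print db.executesql('SELECT @@session.tx_isolation')
>> # a stock MySQL install reports REPEATABLE-READ here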
>>
>
