STINNER Victor added the comment:

Serhiy: "What if just deny reentrant reads? Set a flag while read into a 
buffer, check it before reading in other thread, and raise RuntimeError."

io.BufferedReader and io.BufferedWriter already raise a RuntimeError on a 
reentrant call, but only when the reentrant call happens in the same thread. 
For example, this makes it easier to debug signal handlers that trigger such a 
reentrant call.
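
For reference, a minimal sketch of that behaviour, assuming a Unix platform
(SIGALRM/setitimer) and using "demo.bin" as a placeholder file name; whether
the RuntimeError actually fires depends on the signal arriving while the main
thread is inside write():

import signal

f = open("demo.bin", "wb")                # io.BufferedWriter

def handler(signum, frame):
    # Reentrant write on the same buffered object, in the same thread:
    # the buffered I/O layer raises RuntimeError instead of corrupting
    # its internal buffer.
    f.write(b"from the signal handler\n")

signal.signal(signal.SIGALRM, handler)
signal.setitimer(signal.ITIMER_REAL, 0.001, 0.001)
try:
    for _ in range(10000):
        f.write(b"x" * 4096)              # main-thread writes
except RuntimeError as exc:
    print("caught:", exc)                 # "reentrant call inside <_io.BufferedWriter ...>"
finally:
    signal.setitimer(signal.ITIMER_REAL, 0, 0)
    f.close()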

I'm not sure about 
0001-stop-crashes-when-iterating-over-a-file-on-multiple-.patch since it 
doesn't fix the consistency problem: two parallel readline() calls can still 
return the same line, instead of being mutually exclusive and each returning a 
different line.
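
Here is a hedged sketch of the race being discussed, reading the same file
object from two threads ("lines.txt" is just a placeholder). On Python 2.7
file objects this pattern is exactly what triggers the crashes the patch
targets; even without a crash, lines can come back duplicated or garbled:

import threading

def reader(f, out):
    while True:
        line = f.readline()
        if not line:
            break
        out.append(line)

seen = []
with open("lines.txt", "rb") as f:
    threads = [threading.Thread(target=reader, args=(f, seen)) for _ in range(2)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()

# With mutually exclusive readline() calls, every line of the file would
# appear exactly once in `seen`; with the race, duplicates or corrupted
# lines can show up instead.
print(len(seen), len(set(seen)))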

I'm not sure about adding a new lock. "Lock" sounds like "deadlocks". I 
dislike the risk of introducing deadlocks this late in the Python 2.7 
development cycle.

I like the idea of raising a simple exception on concurrent operations, but 
I'm not sure how much code it would break :-/ Crazy idea: would it make sense 
to raise an exception by default, but add an opt-in option to ignore it? I say 
"crazy" because it has become clear that the code is not thread-safe, so 
parallel operations on the same file object are likely to corrupt data in 
various funny ways.
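
To make the "crazy idea" concrete, here is a purely hypothetical sketch (none
of these names exist in CPython or in any patch here): concurrent use raises
RuntimeError by default, detected with a non-blocking test-and-set flag rather
than a blocking lock, and an explicit opt-in restores the old unsafe
behaviour:

import threading

class GuardedFile:
    """Hypothetical wrapper, for illustration only."""

    def __init__(self, f, allow_concurrent=False):
        self._f = f
        self._allow_concurrent = allow_concurrent
        self._busy = threading.Lock()     # used as a test-and-set flag, never waited on

    def readline(self):
        if not self._busy.acquire(False): # another operation is in progress
            if self._allow_concurrent:
                return self._f.readline() # explicit opt-in: unsafe, may corrupt data
            raise RuntimeError("concurrent operation on the same file object")
        try:
            return self._f.readline()
        finally:
            self._busy.release()

Since the flag is only tested and set, never waited on, it cannot introduce a
deadlock, which sidesteps the concern above about adding a real lock.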

----------

_______________________________________
Python tracker <rep...@bugs.python.org>
<https://bugs.python.org/issue31530>
_______________________________________