Tim Peters wrote:
> [Claudio Grondi]
>
>> Here an example of what I mean
>> (Python 2.4.2, IDLE 1.1.2, Windows XP SP2, NTFS file system, 80 GByte
>> large file):
>>
>> >>> f = file('veryBigFile.dat','r')
>> >>> f = file('veryBigFile.dat','r+')
>>
>> Traceback (most recent call last):
>>   File "<pyshell#1>", line 1, in -toplevel-
>>     f = file('veryBigFile.dat','r+')
>> IOError: [Errno 2] No such file or directory: 'veryBigFile.dat'
>>
>> Is it a BUG or a FEATURE?
>
> Assuming the file exists and isn't read-only, I bet it's a Windows
> bug, and that if you open in binary mode ("r+b") instead I bet it goes
> away (this wouldn't be the first large-file text-mode Windows bug).
I knew already that 'r+b' fixes it. Yes, you have won the bet :) .

I suppose, as you do, that the cause is the distinction Windows makes
between text and binary files: text files are apparently opened and
buffered using e.g. a 32-bit file pointer, which fails on NTFS files
this large. I could also imagine that Python itself tries to buffer
the text file and fails because it uses the wrong pointer size when
asking Windows for the content. I have not yet looked into Python's
C code; any hint which file I should take a closer look at? Just
curious to see for myself that the bug is on the Windows side.

Claudio Grondi
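P.S.: For anyone who wants to poke at this without reading C code, here
is a minimal sketch (it assumes the 80 GByte 'veryBigFile.dat' from the
example above exists and is writable) that opens the file in binary
update mode and seeks past the 4 GiB boundary, i.e. exactly the point
where a 32-bit file pointer would give up:

path = 'veryBigFile.dat'   # the large file from the example above

f = open(path, 'r+b')      # binary update mode; text mode 'r+' is what failed
try:
    f.seek(2 ** 32)        # jump past the 32-bit offset boundary (4 GiB)
    data = f.read(16)      # read a handful of bytes there
    print 'read %d bytes, now at offset %d' % (len(data), f.tell())
finally:
    f.close()

If I read the source layout right, the built-in file type is
implemented in Objects/fileobject.c and mostly delegates to the
Microsoft C runtime's stdio, so a text-mode limitation in the CRT
would show through Python unchanged.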