On Thu, 2008-12-04 at 20:01 +0100, Дамјан Георгиевски wrote:
> > I don't think it matters. Here's a quick comparison between 2.5 and
> > 3.0 on a relatively small 17 meg file:
> >
> > C:\>c:\Python30\python -m timeit -n 1
> > "open('C:\\work\\temp\\bppd_vsub.csv', 'rb').read()"
> > 1 loops, best of 3: 36.8 sec per loop
> >
> > C:\>c:\Python25\python -m timeit -n 1
> > "open('C:\\work\\temp\\bppd_vsub.csv', 'rb').read()"
> > 1 loops, best of 3: 33 msec per loop
> >
> > That's 3 orders of magnitude slower on python3.0!
>
> Isn't this because you have the file cached in memory on the second run?
Even on two different files of identical size it's still ~3x slower:

$ dd if=/dev/urandom of=file1 bs=1M count=70
70+0 records in
70+0 records out
73400320 bytes (73 MB) copied, 14.8693 s, 4.9 MB/s

$ dd if=/dev/urandom of=file2 bs=1M count=70
70+0 records in
70+0 records out
73400320 bytes (73 MB) copied, 16.1581 s, 4.5 MB/s

$ python2.5 -m timeit -n 1 'open("file1", "rb").read()'
1 loops, best of 3: 5.26 sec per loop

$ python3.0 -m timeit -n 1 'open("file2", "rb").read()'
1 loops, best of 3: 14.8 sec per loop
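
If anyone wants to reproduce this without timeit, a sketch along these lines should run unchanged under both 2.5 and 3.0 (the file argument is just a placeholder; pass it a large file that hasn't been read recently, so the OS page cache doesn't skew the numbers):

    import sys
    import time

    # path to a large file, ideally one not already in the page cache
    fname = sys.argv[1]

    start = time.time()
    data = open(fname, 'rb').read()
    elapsed = time.time() - start

    print("read %d bytes in %.2f s" % (len(data), elapsed))

Running that against two freshly created files, one per interpreter, would isolate the read path itself from caching effects, much like the dd test above.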