>>> Note that every time you see [x for x in ...] with no condition, you
>>> can write list(...) instead - more clear, and faster.
>>>
>>> data = list(csv.reader(open('some.csv', 'rb')))
>>
>> Faster? No. List Comprehensions are faster.
>
> [EMAIL PROTECTED] pdfps $ python -m timeit -c 'data = list(open("make.ps"))'
> 100 loops, best of 3: 7.5 msec per loop
> [EMAIL PROTECTED] pdfps $ python -m timeit -c 'data = [line for line in
> open("make.ps")]'
> 100 loops, best of 3: 9.2 msec per loop
>
> On my system just putting it into a list is faster. I think this is
> because you don't need to assign each line to the variable 'line' each
> time in the former case.
>
> I, too, think it's faster to just use list() instead of '[line for line
> in iterable]', as it seems kind of redundant.
$ python -m timeit -c 'import csv; data = list(csv.reader(open("some.csv", "rb")))'
10000 loops, best of 3: 44 usec per loop
$ python -m timeit -c 'import csv; data = [row for row in csv.reader(open("some.csv", "rb"))]'
10000 loops, best of 3: 37 usec per loop

I don't know why there seems to be a difference, but I know that list
comps in Python are very heavily optimised.

-- 
http://mail.python.org/mailman/listinfo/python-list
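For anyone who wants to reproduce this without a CSV file handy, here is a minimal sketch using the timeit module directly; an in-memory range stands in for the file (the sizes and number of repetitions are arbitrary choices, not from the benchmarks above):

```python
import timeit

# Compare list(iterable) against the equivalent bare comprehension
# [x for x in iterable] over the same in-memory data.
setup = "data = range(10000)"

t_list = timeit.timeit("list(data)", setup=setup, number=1000)
t_comp = timeit.timeit("[x for x in data]", setup=setup, number=1000)

print("list():        %.4f s" % t_list)
print("comprehension: %.4f s" % t_comp)
```

Which one wins can vary by Python version and by what the iterable is, so it's worth timing your own case rather than trusting a rule of thumb.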