John Machin wrote:
Background: There was/is a very recent thread about ways of removing
all instances of x from a list. /F proposed a list comprehension to
build the result list. Given a requirement to mutate the original list,
this necessitates the assignment to lst[:]. I tried a generator
expression as well. However, while the listcomp stayed competitive up to
a million-element list, the genexp went into outer space, taking about
20 times as long. The above timeit runs show a simpler scenario where
the genexp also seems to be going quadratic.
Comments, clues, ... please.
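
[For readers who missed that thread, the two forms under discussion look
roughly like this. The variable names are mine and this is only a sketch of
the pattern, not the exact code from the thread:

    lst = [1, 2, 3, 2, 4, 2]
    x = 2

    # List comprehension: build the filtered list eagerly, then mutate the
    # original in place via slice assignment.
    lst[:] = [item for item in lst if item != x]

    # Generator expression variant: same slice assignment, but fed lazily.
    # (This is the form that showed the severe slowdown on large lists.)
    lst = [1, 2, 3, 2, 4, 2]
    lst[:] = (item for item in lst if item != x)
]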

Py> lc = [x for x in range(100)]
Py> len(lc)
100
Py> ge = (x for x in range(100))
Py> len(ge)
Traceback (most recent call last):
  File "<stdin>", line 1, in ?
TypeError: len() of unsized object

It would be nice if unconditional generator expressions (no if clause) over inputs with a known length propagated __len__, but that is not currently the case. There's a similar performance glitch associated with constructing a tuple from a generator expression (with vanilla 2.4, detouring via a list is actually faster).
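
A quick-and-dirty way to see the tuple glitch for yourself is below. The sizes
and repeat counts are mine, not the original benchmark, and the exact numbers
will vary with interpreter version and machine:

    from timeit import Timer

    setup = "data = range(100000)"
    # Build the tuple straight from the generator expression...
    direct = Timer("tuple(x for x in data)", setup)
    # ...versus detouring through a list first.
    via_list = Timer("tuple([x for x in data])", setup)

    print("direct:   %.3f" % min(direct.repeat(3, 10)))
    print("via list: %.3f" % min(via_list.repeat(3, 10)))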

Cheers,
Nick.

--
Nick Coghlan   |   [EMAIL PROTECTED]   |   Brisbane, Australia
---------------------------------------------------------------
            http://boredomandlaziness.skystorm.net