[Guido]
> ...
> I learned something in this thread -- I had no idea that the deque datatype
> even has an option to limit its size (and silently drop older values as new
> ones are added), let alone that the case of setting the size to zero is
> optimized in the C code. But more importantly, I don't think I've ever needed
> either of those features, so maybe I was better off not knowing about them?
I was aware of both, but never used maxlen=0. It appeals, I guess,
for much the same reason it's sometimes convenient to throw away
output in a Unixy shell just by redirecting to /dev/null. I write the
`for` loop instead because it's clear at once.
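To make the /dev/null analogy concrete, here is a small sketch (my own
illustration, not from the post above) of the two spellings of "consume an
iterator and throw everything away" -- the zero-length deque trick versus
the plain loop:

```python
from collections import deque

def drain_with_deque(it):
    # Feeding an iterator into a maxlen=0 deque consumes it entirely;
    # every appended value is immediately discarded (the zero-length
    # case is special-cased in the C implementation).
    deque(it, maxlen=0)

def drain_with_for(it):
    # The plain-loop spelling of the same thing -- clearer at a glance.
    for _ in it:
        pass

# Either way, the iterator's side effects still run:
seen = []
drain_with_deque(seen.append(i) for i in range(5))
print(seen)  # [0, 1, 2, 3, 4]
```

Both spellings exhaust the iterator; the deque version is terser, the loop
version says what it means.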
Non-zero sizes do have real uses, obviously so when working with
linear recurrences. For example, a Fibonacci generator:
    def fib():
        from collections import deque
        d = deque([0, 1], 2)
        yield from d
        while True:
            c = sum(d)
            yield c
            d.append(c)
Of course the deeper the recurrence, the more pleasant this is compared
with hoping not to make a subtle typo when shifting N variables "by hand".
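As a hypothetical illustration of that point, the fib() pattern above
generalizes to any order-N "sum the last N terms" recurrence just by
changing the deque's initial contents -- the bounded deque does all the
shifting for you (linrec is my name, not anything standard):

```python
from collections import deque
from itertools import islice

def linrec(initial):
    # Generate the sequence whose next term is the sum of the last
    # len(initial) terms; the deque's maxlen silently drops the
    # oldest term on each append.
    d = deque(initial, len(initial))
    yield from d
    while True:
        c = sum(d)
        yield c
        d.append(c)

print(list(islice(linrec([0, 1], ), 8)))    # Fibonacci:  [0, 1, 1, 2, 3, 5, 8, 13]
print(list(islice(linrec([0, 1, 1]), 10)))  # Tribonacci: [0, 1, 1, 2, 4, 7, 13, 24, 44, 81]
```

No variable shuffling anywhere, no matter how deep the recurrence goes.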
> Have we collectively been nerd-sniped by an interesting but unimportant
> problem?
Pretty much ;-) Here's another: what's the fastest way to get a
Python loop to go around N times?
Bingo:

    for _ in itertools.repeat(None, N):
        pass
No dynamic memory churn: no new objects are created per iteration, not
even under the covers (the implementation uses a native C ssize_t to
hold the remaining iteration count).
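A rough timeit sketch of the difference (my own comparison; the absolute
numbers will vary by machine and Python version, so no specific speedup is
claimed here):

```python
from itertools import repeat
from timeit import timeit

N = 10**6

# Both loop bodies do nothing; any difference is pure iteration overhead.
# range(N) hands out a distinct int object for most indices, while
# repeat(None, N) returns the same cached None every time and just
# decrements an internal C counter.
t_range  = timeit('for _ in range(N): pass',        globals=globals(), number=10)
t_repeat = timeit('for _ in repeat(None, N): pass', globals=globals(), number=10)

print(f'range:  {t_range:.3f}s')
print(f'repeat: {t_repeat:.3f}s')
```

Run it yourself to see how the two idioms compare on your interpreter.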
itertools is the answer to every question ;-)
_______________________________________________
Python-ideas mailing list -- [email protected]
To unsubscribe send an email to [email protected]
https://mail.python.org/mailman3/lists/python-ideas.python.org/
Message archived at
https://mail.python.org/archives/list/[email protected]/message/3V4WX4KTDQEB624GJTXPSW7Y5HD3BSJ3/
Code of Conduct: http://python.org/psf/codeofconduct/