On 2/2/2014 5:40 AM, andrea crotti wrote:
2014-02-02 Terry Reedy <tjre...@udel.edu>:
On 2/1/2014 9:12 AM, andrea crotti wrote:

Comments:

The use of assert in the first slide seems bad in a couple of different
respects.


Why is it bad? It's probably not necessary, but since we ask for a
range it might be good to check that the range is valid.
Maybe I should raise ValueError instead, for a better exception?

In general, use assert (== AssertionError) to check program logic (should never raise). Remember that assert can be optimized away. Use other exceptions to check user behavior. So I believe that ValueError is appropriate here. I think I also questioned the particular check.
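A sketch of that advice; gen_even here is a hypothetical version of the slide's function, not its actual code:

```python
def gen_even(start, stop):
    """Yield the even numbers in range(start, stop)."""
    # Validate caller input with ValueError: unlike an assert statement,
    # this check is not stripped when running under `python -O`.
    if start > stop:
        raise ValueError(f"invalid range: start={start} > stop={stop}")
    for i in range(start, stop):
        if i % 2 == 0:
            yield i
```

Note that because gen_even is a generator function, the ValueError is raised only when iteration starts; merely calling gen_even(5, 1) creates the generator without running the check.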

The use of 'gen_even' before it is defined.


Well, this is because I'm saying that I wish I had something like this,
which I define just after. It might be confusing if it's not defined,
but I thought it's nice to say what I would like to do and then
actually define it. What do you think?

In commenting on the slides, I did not know what you would say to supplement them.

A generator expression evaluates (better than 'yields') to a generator, not
just an iterator.


Ok, thanks, fixed.

The definition of 'generator' copies the wrong and confused glossary entry.
Generator functions return generators, which are iterators with extra
behavior.


I understood it instead as the opposite: a generator is a
specialized iterator.

'Generator functions', which you labeled 'generators', are functions, not iterators. The generators they return (and the generators that generator expressions evaluate to) are iterators, and more.

>>> type(a for a in 'abc')
<class 'generator'>

I am not sure whether 'specialized' or 'generalized' is the better term.
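One concrete way to see the relationship, as a sketch using the standard collections.abc ABCs:

```python
from collections.abc import Iterator, Generator

g = (a for a in 'abc')           # generator expression -> generator object
assert isinstance(g, Iterator)   # every generator is an iterator...
assert isinstance(g, Generator)  # ...and a generator besides
# The iterator protocol:
assert hasattr(g, '__iter__') and hasattr(g, '__next__')
# The "more": coroutine-style methods plain iterators need not have
assert hasattr(g, 'send') and hasattr(g, 'throw') and hasattr(g, 'close')
```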


I would leave out For loop(2). The old pseudo-getitem iterator protocol is
seldom explicitly used any more, in the way you showed.

This was mainly to explain how something like
for el in [1, 2, 3]:
     print(el)

can work.

But it is not just that it *can* work: it *does* work. All the builtin xyz collection classes have a corresponding xyz_iterator class with a __next__ method that knows how to sequentially access collection items. We do not normally see or think about them, but they are there working for us every time we do 'for item in xyz_instance:'.

>>> [].__iter__()
<list_iterator object at 0x00000000035096A0>

In Python one could write the following:

class list_iterator:
    def __init__(self, baselist):
        self.baselist = baselist
        self.index = -1  # incremented before use; see __next__
    def __iter__(self):
        return self
    def __next__(self):
        self.index += 1
        try:
            return self.baselist[self.index]
        except IndexError:
            # a for loop stops on StopIteration, not IndexError
            raise StopIteration from None

but the C version should use a static pointer into the object address array.
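The same machinery can also be driven by hand; a minimal sketch using the builtin iter() and next():

```python
it = iter([1, 2, 3])             # returns the builtin list_iterator
assert type(it).__name__ == 'list_iterator'
assert next(it) == 1
assert next(it) == 2
assert next(it) == 3
try:
    next(it)                     # exhausted iterator
except StopIteration:
    pass                         # a for loop catches this to end the loop
```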

'Laziness drawbacks': overflow_list is bizarre and useless. overflow_gen is
bizarre and buggy. If you are intentionally writing buggy code to make a
point, label it as such on the slide.


Yes, this is intentionally buggy. The thing is that I wanted to show
that generating things sometimes makes debugging harder and delays
some errors, which are there anyway but would come up immediately in
the case of a list creation.
I could not find a better, non-artificial example for this; any
suggestion is welcome...

slide 1
---------
def recip_list(start, stop):
    lis = []
    for i in range(start, stop):
        lis.append(1/i)
    return lis

for x in recip_list(-100, 3):  # fail here
    print(x)

<immediate traceback that includes the for line>

slide 2
-------
def recip_gen(start, stop):
    for i in range(start, stop):
        yield 1/i


for x in recip_gen(-100, 3):
    print(x)  # fail here after printing 100 lines
...
<delayed traceback that omits the for line and the args that caused the problem>
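If eager failure is wanted while keeping the generator definition, one option (an assumption on my part, not from the slides) is to force full evaluation with list() at the call site:

```python
def recip_gen(start, stop):
    for i in range(start, stop):
        yield 1 / i

# Lazy: the ZeroDivisionError would surface only after 100 iterations.
# Eager: list() consumes the generator immediately, so the traceback
# points at this call, as with the list-building version on slide 1.
try:
    values = list(recip_gen(-100, 3))
except ZeroDivisionError:
    values = None  # error surfaced here, at the call site
```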

--
Terry Jan Reedy

--
https://mail.python.org/mailman/listinfo/python-list
