On Apr 27, 2020, at 12:49, Soni L. <[email protected]> wrote:
>
> I wanna propose making generators even weirder!
Why? Most people would consider that a negative, not a positive. Even if you
demonstrate some useful functionality with realistic examples that benefit from
it, all you’ve done here is set the bar higher for yourself to convince anyone
that your change is worth it.
> so, extended continue is an oldie:
https://www.python.org/dev/peps/pep-0342/#the-extended-continue-statement
>
> it'd allow one to turn:
>
> yield from foo
>
> into:
>
> for bar in foo:
>     continue (yield bar)
And what’s the advantage of that? It’s a lot more verbose, harder to read,
probably easier to get wrong, and presumably less efficient. If this is your
best argument for why we should revisit an old rejected idea, it’s not a very
good one.
(If you’re accepting that it’s a pointless feature on its own but proposing it
because, together with your other proposed new feature, it would no longer be
pointless, then say that, don’t offer an obviously bad argument for it on its
own.)
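For what it’s worth, the simple spelling already covers the interesting part: yield from forwards send()/throw() into the inner generator and hands you its return value, which the bare for loop version silently drops. A quick sketch (the generator names here are mine, just for illustration):

```python
# What "yield from" already gives you today: sent values are
# forwarded into the inner generator, and its return value comes
# back out, with no extended-continue needed.
def echo():
    # Inner generator: collects whatever is sent to it.
    received = []
    while len(received) < 2:
        received.append((yield len(received)))
    return received

def delegate():
    # yield from transparently forwards send()/throw()/close()
    # and binds the inner generator's return value.
    result = yield from echo()
    return result

g = delegate()
print(next(g))       # 0 -- first yield from echo()
print(g.send("a"))   # 1 -- "a" was forwarded into echo()
try:
    g.send("b")
except StopIteration as e:
    print(e.value)   # ['a', 'b'] -- echo()'s return value
```

A plain `for bar in foo: yield bar` would have thrown those sent values away, which is exactly why yield from exists.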
> but what's this extended for-else? well, currently you have for-else:
>
> for x, y, z in zip(a, b, c):
>     ...
> else:
>     pass
>
> and this works. you get the stuff from the iterators, and if you break the loop, the else doesn't run. the else basically behaves like "except StopIteration:"...
>
> so I propose an extended for-else, that behaves like "except StopIteration as foo:". that is, assuming we could get a zip() that returns partial results in the StopIteration (see other threads), we could do:
>
> for x, y, z in zip(a, b, c):
>     do_stuff_with(x, y, z)
> else as partial_xy:
>     if len(partial_xy) == 0:
>         x = dummy
>         try:
>             y = next(b)
>         except StopIteration: y = dummy
>         try:
>             z = next(c)
>         except StopIteration: z = dummy
>         if (x, y, z) != (dummy, dummy, dummy):
>             do_stuff_with(x, y, z)
>     if len(partial_xy) == 1:
>         x, = partial_xy
>         y = dummy
>         try:
>             z = next(c)
>         except StopIteration: z = dummy
>         do_stuff_with(x, y, z)
>     if len(partial_xy) == 2:
>         x, y = partial_xy
>         z = dummy
>         do_stuff_with(x, y, z)
>
> (this example is better served by zip_longest. however, it's nevertheless a good way to demonstrate functionality, thanks to zip_longest's (and zip's) trivial/easy to understand behaviour.)
Would it always be this complicated and verbose to use this feature? I mean,
compare it to the “roughly equivalent” zip_longest in the docs, which is a lot
shorter, easier to understand, harder to get wrong, and more flexible (e.g., it
works unchanged with any number of iterables, while yours has to be rewritten
for each different number of iterables because it requires N chunks of explicit
boilerplate).
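To make the comparison concrete, here’s a sketch along the lines of the zip_longest equivalent in the itertools docs (simplified from the recipe there, not a verbatim copy): manual nexting handles any number of iterables with one loop and zero per-arity boilerplate.

```python
from itertools import repeat

def zip_longest_sketch(*iterables, fillvalue=None):
    # One loop, any arity: a stopped iterator is swapped for an
    # endless supply of fillvalue, and we stop when all have stopped.
    iterators = [iter(it) for it in iterables]
    num_active = len(iterators)
    if not num_active:
        return
    while True:
        values = []
        for i, it in enumerate(iterators):
            try:
                value = next(it)
            except StopIteration:
                num_active -= 1
                if not num_active:
                    return
                iterators[i] = repeat(fillvalue)
                value = fillvalue
            values.append(value)
        yield tuple(values)

print(list(zip_longest_sketch([1, 2, 3], [1])))
# [(1, 1), (2, None), (3, None)]
```

Note that there is no per-length case analysis anywhere: the same dozen lines serve 2, 3, or 10 iterables.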
Are there any examples where it lets you do something useful that can’t be done
with existing features, so it’s actually worth learning this weird new feature
and requiring Python 3.10+ and writing 22 lines of extra code?
Even if there is such an example, if the code to deal with the post-for state
is 11x as long and complicated as the for loop and can’t be easily simplified
or abstracted, is using a for loop instead of manually nexting iterators still
a net benefit? I don’t know that manually nexting the iterators will always
avoid the problem, but it certainly often does (again, look at many of the
equivalents in the itertools docs that do it), and it definitely does in your
emulating-zip_longest example, and that’s the only example you’ve offered.
Also notice that many cases like this can be trivially solved by a simple
peekable or unnextable (I believe more-itertools has both, and the first one is
a recipe in itertools too, but I can’t remember the names they use; if not,
they’re really easy to write) or tee. We don’t even need any of that for your
example, but if you can actually come up with another example, make sure it
isn’t already doable a lot more simply with peekable/etc.
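In case it isn’t obvious how small such a helper is, here’s a hand-rolled minimal peekable (my own names and API, not more-itertools’, which is richer):

```python
# A minimal peekable wrapper: lets you look at the next item
# without consuming it. Sketch only; more-itertools has a more
# featureful version.
_SENTINEL = object()

class Peekable:
    def __init__(self, iterable):
        self._it = iter(iterable)
        self._cache = _SENTINEL

    def __iter__(self):
        return self

    def peek(self, default=_SENTINEL):
        # Cache one item ahead; return default (if given) when exhausted.
        if self._cache is _SENTINEL:
            try:
                self._cache = next(self._it)
            except StopIteration:
                if default is _SENTINEL:
                    raise
                return default
        return self._cache

    def __next__(self):
        if self._cache is not _SENTINEL:
            value, self._cache = self._cache, _SENTINEL
            return value
        return next(self._it)

p = Peekable([1, 2])
print(p.peek())      # 1 -- look ahead without consuming
print(next(p))       # 1 -- the peeked item is still delivered
print(next(p))       # 2
print(p.peek(None))  # None -- exhausted, default returned
```

Twenty-odd lines, written once, and it covers a large chunk of the “I broke out of the loop and now I need to know where the iterator stands” cases.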
> this would enable one to turn:
>
> return yield from foo
>
> into:
>
> for bar in foo:
>     continue (yield bar)
> else as baz:
>     return baz
>
> allowing one to pick apart and modify the yielded and sent parts, while still getting access to the return values.
Again, this is letting you turn something simple into something more
complicated, and it’s not at all clear why you want to do that. What exactly
are you trying to pick apart that makes that necessary, that can’t be written
better today?
I’ll grant that writing something fully general that supports all the different
things that could be theoretically done with your desired feature requires the
ugly mess that you posted below. (I’m not sure it’s true, but it seems at least
possible, so let’s go with it.) But that doesn’t mean that writing something
appropriate for any particular realistic example requires that. And without
seeing any such examples, or even getting a vague description of them, nobody
has any reason to believe there’s an actual problem to be solved for any of
them.
While we’re at it: what does “else as” do when you’re looping over a sequence
until IndexError instead of looping over an iterator? What does it do on while
loops? What does it do on for loops over an iterator whose StopIteration has
no value?
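The sequence case isn’t hypothetical, by the way: a class with only __getitem__ is iterable, and iteration ends on IndexError, so there is no StopIteration value anywhere for “else as” to bind. A sketch (the class name is mine):

```python
# The old sequence protocol: no __iter__, no __next__, no
# StopIteration value -- iteration calls __getitem__ with 0, 1, 2...
# until IndexError.
class Squares:
    def __getitem__(self, i):
        if i >= 3:
            raise IndexError
        return i * i

print(list(Squares()))  # [0, 1, 4]
```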
Also, unless you have some additional proposal that you haven’t mentioned here,
it seems like as soon as you add any further transformation after the zip (a
comprehension, many uses of map, most functions out of itertools or a
third-party library, …) you’ve either lost the magic StopIteration value or
made it incorrect. To demonstrate what I mean:
def gen():
    yield 1
    return 2

it = gen()
next(it)  # yields 1
next(it)  # raises StopIteration(2)

it = (x*3 for x in gen())
next(it)  # yields 1*3 == 3
next(it)  # raises StopIteration()

it = chain([0], gen())
next(it)  # yields 0
next(it)  # yields 1
next(it)  # raises StopIteration()
Most places where you need to do fancy stuff with iteration, you expect to be
able to transform iterators in ways like this without breaking everything. But
your new feature won’t allow that. It can only be used if you’re directly
iterating on the result of zip. Which seems to imply there probably aren’t any
good use cases at all other than ones you can already handle better with
zip_longest, in which case what’s the point of any of this?
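And for reference, in the one case where you really are directly iterating the generator, reaching its return value today needs no new syntax at all: catch StopIteration yourself, or delegate with yield from, which binds the value directly. A quick sketch:

```python
# Today's idiom: the return value of a generator rides on
# StopIteration.value, so a manual next() loop can grab it.
def gen():
    yield 1
    return 2

values = []
it = gen()
while True:
    try:
        values.append(next(it))
    except StopIteration as e:
        result = e.value
        break

print(values, result)  # [1] 2
```

So even the directly-iterating case doesn’t obviously need “else as”.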