Re: [Python-Dev] Please reconsider PEP 479.

2014-11-27 Thread Nick Coghlan
On 27 November 2014 at 11:15, Guido van Rossum  wrote:
> On Wed, Nov 26, 2014 at 2:53 PM, Nick Coghlan  wrote:
>>
>> On 27 Nov 2014 06:35, "Guido van Rossum"  wrote:
>>
>> [...]
>>
>> > I think we can put a number to "much faster" now -- 150 nsec per
>> > try/except.
>> >
>> > I have serious misgivings about that decorator though -- I'm not sure
>> > how viable it is to pass a flag from the function object to the execution
>> > (which takes the code object, which is immutable) and how other Python
>> > implementations would do that. But I'm sure it can be done through sheer
>> > willpower. I'd call it the @hettinger decorator in honor of the PEP's most
>> > eloquent detractor. :-)
>>
>> I agree with everything you wrote in your reply, so I'll just elaborate a
>> bit on my proposed implementation for the decorator idea.
>
> This remark is ambiguous -- how strongly do you feel that this decorator
> should be provided? (If so, it should be in the PEP.)

I think it makes sense to standardise it, but something like
"itertools.allow_implicit_stop" would probably be better than having
it as a builtin. (The only reason I suggested a builtin initially is
because putting it in itertools didn't occur to me until later)

Including the decorator provides a straightforward way to immediately
start writing forward compatible code that's explicit about the fact
it relies on the current StopIteration handling, without being
excessively noisy relative to the status quo:

# In a module with a generator that relies on the current behaviour
from itertools import allow_implicit_stop

@allow_implicit_stop
def my_generator():
    ...
    yield next(itr)
    ...

In terms of code diffs to ensure forward compatibility, it's 1 import
statement per affected module, and 1 decorator line per affected
generator, rather than at least 3 lines (for try/except/return) plus
indentation changes for each affected generator. That's a useful
benefit when it comes to minimising the impact on version control code
annotation, etc.
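
For comparison, the try/except/return form that PEP 479 would otherwise
require can be sketched like this (illustrative code, not from the
thread; `itr` is made a parameter here rather than taken from enclosing
scope):

```python
# PEP 479-safe form without the decorator: guard the bare next() call
# explicitly so the generator returns instead of leaking StopIteration.
def my_generator_explicit(itr):
    while True:
        try:
            value = next(itr)
        except StopIteration:
            return  # explicit end of iteration instead of a leaked exception
        yield value
```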

If compatibility with older Python versions is needed, then you could
put something like the following in a compatibility module:

try:
    from itertools import allow_implicit_stop
except ImportError:
    # Allowing implicit stops is the default in older versions
    def allow_implicit_stop(g):
        return g

Regards,
Nick.

-- 
Nick Coghlan   |   [email protected]   |   Brisbane, Australia
___
Python-Dev mailing list
[email protected]
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] PEP 479: Change StopIteration handling inside generators

2014-11-27 Thread Nick Coghlan
On 27 November 2014 at 09:50, Guido van Rossum  wrote:
> On Wed, Nov 26, 2014 at 3:15 PM, Nick Coghlan  wrote:
>> This is actually the second iteration of this bug: the original
>> implementation *always* suppressed StopIteration. PJE caught that one before
>> Python 2.5 was released, but we didn't notice that 3.3 had brought it back
>> in a new, more subtle form :(
>>
>> It's worth noting that my "allow_implicit_stop" idea in the other thread
>> wouldn't affect subgenerators - those would still convert StopIteration to
>> RuntimeError unless explicitly silenced.
>
> You've lost me in this subthread. Am I right to conclude that the PEP change
> doesn't cause problems for contextlib(*), but that the PEP change also
> probably wouldn't have helped diagnose any contextlib bugs?

I think the PEP 479 semantics would have made both bugs (the one PJE
found in 2.5, and the newer one Isaac pointed out here) less cryptic,
in that they would have caused RuntimeError to be raised, rather than
silently consuming the StopIteration and continuing execution after
the with statement body.
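
A minimal sketch of why the conversion helps (this is not contextlib's
actual code; it just simulates throwing StopIteration into a paused
generator, as a generator-based context manager driver would):

```python
def manager():
    # stand-in for a generator-based context manager body
    yield

g = manager()
next(g)  # advance to the yield, as __enter__ would
try:
    g.throw(StopIteration)  # simulate the with-body raising StopIteration
    outcome = "swallowed"
except RuntimeError:
    outcome = "RuntimeError"   # PEP 479 semantics (Python 3.7+)
except StopIteration:
    outcome = "StopIteration"  # pre-PEP 479 semantics
```

Under PEP 479 the escaping StopIteration becomes a RuntimeError, so it
can no longer masquerade as a clean exit.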

With the new semantics, contextlib just needs to be updated to cope
with the StopIteration -> RuntimeError conversion, and Isaac's
"spurious success" bug will be fixed*.

Without PEP 479, I believe my only recourse to systematically
eliminate the risk of generator based context managers silently
consuming StopIteration would be to implement the "StopIteration ->
gen.close()" workaround, and that would be a backwards incompatible
change in its own right.

Cheers,
Nick.

P.S. *(This does mean I was wrong about allow_implicit_stop being
useful to contextlib, but I still think the decorator is useful for
cases where StopIteration is being used to terminate the generator on
purpose)

-- 
Nick Coghlan   |   [email protected]   |   Brisbane, Australia


[Python-Dev] Strange "help(int.__lt__)". Probably documentation bug

2014-11-27 Thread Jesus Cea
http://bugs.python.org/issue20530#msg231584

-- 
Jesús Cea Avión _/_/  _/_/_/_/_/_/
[email protected] - http://www.jcea.es/ _/_/_/_/  _/_/_/_/  _/_/
Twitter: @jcea_/_/_/_/  _/_/_/_/_/
jabber / xmpp:[email protected]  _/_/  _/_/_/_/  _/_/  _/_/
"Things are not so easy"  _/_/  _/_/_/_/  _/_/_/_/  _/_/
"My name is Dump, Core Dump"   _/_/_/_/_/_/  _/_/  _/_/
"El amor es poner tu felicidad en la felicidad de otro" - Leibniz





Re: [Python-Dev] Strange "help(int.__lt__)". Probably documentation bug

2014-11-27 Thread Victor Stinner
2014-11-27 13:28 GMT+01:00 Jesus Cea :
> http://bugs.python.org/issue20530#msg231584

Copy/paste of the message:

Preparing a presentation about Python Magic methods I found something
weird: (Python 3.4)

"""
>>> help(int.__lt__)
Help on wrapper_descriptor:

__lt__(self, value, /)  <- THIS!!
Return self<value.
"""

I am amused about the "/)" suffix in the signature. It happens to all
magic methods.


Re: [Python-Dev] Strange "help(int.__lt__)". Probably documentation bug

2014-11-27 Thread Victor Stinner
2014-11-27 13:41 GMT+01:00 Victor Stinner :
> 2014-11-27 13:28 GMT+01:00 Jesus Cea :
>> http://bugs.python.org/issue20530#msg231584
>
> Copy/paste of the message:
>
> Preparing a presentation about Python Magic methods I found something
> weird: (Python 3.4)
>
> """
> >>> help(int.__lt__)
> Help on wrapper_descriptor:
>
> __lt__(self, value, /)  <- THIS!!
> Return self<value.
> """
>
> I am amused about the "/)" suffix in the signature. It happens to all
> magic methods.


Re: [Python-Dev] Strange "help(int.__lt__)". Probably documentation bug

2014-11-27 Thread Victor Stinner
2014-11-27 13:41 GMT+01:00 Victor Stinner :
> I am amused about the "/)" suffix in the signature. It happens to all
> magic methods.

If I remember correctly, it means that the function does not accept keywords:

>>> (3).__lt__(4)
True
>>> (3).__lt__(value=4)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: wrapper __lt__ doesn't take keyword arguments

Victor


Re: [Python-Dev] Strange "help(int.__lt__)". Probably documentation bug

2014-11-27 Thread Jesus Cea
On 27/11/14 13:42, Victor Stinner wrote:
> 2014-11-27 13:41 GMT+01:00 Victor Stinner :
>> I am amused about the "/)" suffix in the signature. It happens to all
>> magic methods.
> 
> If I remember correctly, it means that the function does not accept keywords:

I don't understand. Is that an internal annotation for the Argument Clinic machinery?






Re: [Python-Dev] Strange "help(int.__lt__)". Probably documentation bug

2014-11-27 Thread Nick Coghlan
On 27 November 2014 at 23:43, Jesus Cea  wrote:
> On 27/11/14 13:42, Victor Stinner wrote:
>> 2014-11-27 13:41 GMT+01:00 Victor Stinner :
>>> I am amused about the "/)" suffix in the signature. It happens to all
>>> magic methods.
>>
>> If I remember correctly, it means that the function does not accept
>> keywords:
>
> I don't understand. Is that an internal annotation for the Argument Clinic machinery?

See PEP 457 for the broader context: https://www.python.org/dev/peps/pep-0457/

The migration of pydoc (and other introspection APIs) to
inspect.signature in Python 3.4 entailed having an unambiguous string
representation of positional only parameters - that's the trailing '/'
(which mirrors the corresponding syntax in the Argument Clinic DSL).
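
As a quick illustration (runs on current CPython; note that in later
Python versions, 3.8+, the same '/' marker also became legal syntax in
def statements):

```python
import inspect

# The trailing '/' marks everything before it as positional-only.
sig = inspect.signature(int.__lt__)
print(sig)  # e.g. (self, value, /)

# Since Python 3.8, plain Python functions can declare this too:
def less(a, b, /):
    return a < b

print(less(1, 2))   # positional arguments are fine
try:
    less(a=1, b=2)  # keyword arguments are rejected
except TypeError as exc:
    print(exc)
```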

Cheers,
Nick.

-- 
Nick Coghlan   |   [email protected]   |   Brisbane, Australia


Re: [Python-Dev] Please reconsider PEP 479.

2014-11-27 Thread Guido van Rossum
On Thu, Nov 27, 2014 at 3:04 AM, Nick Coghlan  wrote:

> On 27 November 2014 at 11:15, Guido van Rossum  wrote:
> > On Wed, Nov 26, 2014 at 2:53 PM, Nick Coghlan 
> wrote:
> >>
> >> On 27 Nov 2014 06:35, "Guido van Rossum"  wrote:
> >>
> >> [...]
> >>
> >> > I think we can put a number to "much faster" now -- 150 nsec per
> >> > try/except.
> >> >
> >> > I have serious misgivings about that decorator though -- I'm not sure
> >> > how viable it is to pass a flag from the function object to the
> execution
> >> > (which takes the code object, which is immutable) and how other Python
> >> > implementations would do that. But I'm sure it can be done through
> sheer
> >> > willpower. I'd call it the @hettinger decorator in honor of the PEP's
> most
> >> > eloquent detractor. :-)
> >>
> >> I agree with everything you wrote in your reply, so I'll just elaborate
> a
> >> bit on my proposed implementation for the decorator idea.
> >
> > This remark is ambiguous -- how strongly do you feel that this decorator
> > should be provided? (If so, it should be in the PEP.)
>
> I think it makes sense to standardise it, but something like
> "itertools.allow_implicit_stop" would probably be better than having
> it as a builtin. (The only reason I suggested a builtin initially is
> because putting it in itertools didn't occur to me until later)
>
> Including the decorator provides a straightforward way to immediately
> start writing forward compatible code that's explicit about the fact
> it relies on the current StopIteration handling, without being
> excessively noisy relative to the status quo:
>
> # In a module with a generator that relies on the current behaviour
> from itertools import allow_implicit_stop
>
> @allow_implicit_stop
> def my_generator():
>     ...
>     yield next(itr)
>     ...
>
> In terms of code diffs to ensure forward compatibility, it's 1 import
> statement per affected module, and 1 decorator line per affected
> generator, rather than at least 3 lines (for try/except/return) plus
> indentation changes for each affected generator. That's a useful
> benefit when it comes to minimising the impact on version control code
> annotation, etc.
>
> If compatibility with older Python versions is needed, then you could
> put something like the following in a compatibility module:
>
> try:
>     from itertools import allow_implicit_stop
> except ImportError:
>     # Allowing implicit stops is the default in older versions
>     def allow_implicit_stop(g):
>         return g
>

I understand that @allow_implicit_stop represents a compromise, an attempt at
calming the waves that PEP 479 has caused. But I still want to push back
pretty hard on this idea.

- It means we're forever stuck with two possible semantics for
StopIteration raised in generators.

- It complicates the implementation, because (presumably) a generator
marked with @allow_implicit_stop should not cause a warning when a
StopIteration bubbles out -- so we actually need another flag to silence
the warning.

- I don't actually know whether other Python implementations have the
ability to copy code objects to change flags.

- It actually introduces a new incompatibility, that has to be solved in
every module that wants to use it (as you show above), whereas just putting
try/except around unguarded next() calls is fully backwards compatible.

- Its existence encourages people to use the decorator instead of fixing
their code properly.

- The decorator is so subtle that it probably needs to be explained to
everyone who encounters it (and wasn't involved in this PEP discussion).
Because of this I would strongly advise against using it to "fix" the
itertools examples in the docs; it's just too magical. (IIRC only 2
examples actually depend on this.)

Let me also present another (minor) argument for PEP 479. Sometimes you
want to take a piece of code presented as a generator and turn it into
something else. You can usually do this pretty easily by e.g. replacing
every "yield" by a call to print() or list.append(). But if there are any
bare next() calls in the code you have to beware of those. If the code was
originally written without relying on bare next(), the transformation would
have been easier.
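
The point can be sketched with a pair of hypothetical helpers (names
invented for illustration): a generator whose next() call is already
guarded translates mechanically into an eager version.

```python
# A generator with a *guarded* next() call, and its mechanical
# translation to an eager list builder.
def pairs_gen(itr):
    itr = iter(itr)
    for first in itr:
        try:
            second = next(itr)
        except StopIteration:
            return  # guarded: an odd trailing element is dropped explicitly
        yield (first, second)

def pairs_list(itr):
    # Same logic with yield replaced by append; because next() was
    # guarded, the translation did not change the semantics.
    out = []
    itr = iter(itr)
    for first in itr:
        try:
            second = next(itr)
        except StopIteration:
            return out
        out.append((first, second))
    return out
```

Had pairs_gen relied on a bare next() escaping as StopIteration to end
iteration, the eager version would instead crash on odd-length input.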

-- 
--Guido van Rossum (python.org/~guido)


[Python-Dev] PEP 479 and asyncio

2014-11-27 Thread Victor Stinner
Hi,

I'm trying to follow the discussion about the PEP 479 (Change
StopIteration handling inside generators), but it's hard to read all
messages. I'm concerned by trollius and asyncio which heavily rely on
StopIteration.

Trollius currently supports running asyncio coroutines: a trollius
coroutine can execute an asyncio coroutine, and an asyncio coroutine
can execute a trollius coroutine.

I modified the Return class of Trollius to not inherit from
StopIteration. All trollius tests pass on Python 3.3 except on one
(which makes me happy, the test suite is wide enough to detect bugs
;-)): test_trollius_in_asyncio.

This specific test executes an asyncio coroutine which executes a trollius coroutine.
https://bitbucket.org/enovance/trollius/src/873d21ac0badec36835ed24d13e2aeda24f2dc64/tests/test_asyncio.py?at=trollius#cl-60

The problem is that an asyncio coroutine cannot execute a Trollius
coroutine anymore: "yield from coro" raises a Return exception instead
of simply "stopping" the generator and returning the result (the value
passed to Return).

I don't see how an asyncio coroutine calling "yield from
trollius_coroutine" can handle the Return exception if it doesn't
inherit from StopIteration. Does it mean that I have to drop this
feature in Python 3.5 (or later, when PEP 479 becomes effective)?

I'm talking about the current behaviour of Python 3.3; I didn't try
PEP 479 (I don't know if an implementation exists yet).

Victor
class Return(Exception):
    def __init__(self, value):
        self.value = value

class Task:
    def __init__(self, coro):
        self.coro = coro
        self.result = None
        self.done = False

    def _step(self):
        try:
            next(self.coro)
        except Return as exc:
            self.result = exc.value  # store the coroutine's return value
            self.done = True

    def __iter__(self):
        while not self.done:
            yield self._step()
        return self.result

def trollius_coro(calls):
    calls.append("enter trollius_coro")
    yield None
    calls.append("exit trollius_coro with Return")
    raise Return(5)

def asyncio_coro(calls):
    calls.append("enter asyncio_coro")
    coro = trollius_coro(calls)
    calls.append("asyncio_coro yield from trollius_coro")
    result = yield from coro
    calls.append("asyncio_coro returns %r" % result)
    return result

def test():
    calls = []
    coro = asyncio_coro(calls)

    # simulate a call to loop.run_until_complete(coro)
    task = Task(coro)
    result = yield from task

    for call in calls:
        print(call)
    print("Result: %r" % result)

for item in test():
    pass


Re: [Python-Dev] PEP 479 and asyncio

2014-11-27 Thread Guido van Rossum
On Thu, Nov 27, 2014 at 10:08 AM, Victor Stinner 
wrote:

> I'm trying to follow the discussion about the PEP 479 (Change
> StopIteration handling inside generators), but it's hard to read all
> messages. I'm concerned by trollius and asyncio which heavily rely on
> StopIteration.
>
> Trollius currently supports running asyncio coroutines: a trollius
> coroutine can execute an asyncio coroutine, and an asyncio coroutine
> can execute a trollius coroutine.
>
> I modified the Return class of Trollius to not inherit from
> StopIteration. All trollius tests pass on Python 3.3 except on one
> (which makes me happy, the test suite is wide enough to detect bugs
> ;-)): test_trollius_in_asyncio.
>
> This specific test executes an asyncio coroutine which executes a trollius coroutine.
>
> https://bitbucket.org/enovance/trollius/src/873d21ac0badec36835ed24d13e2aeda24f2dc64/tests/test_asyncio.py?at=trollius#cl-60
>
> The problem is that an asyncio coroutine cannot execute a Trollius
> coroutine anymore: "yield from coro" raises a Return exception instead
> of simply "stopping" the generator and returning the result (the value
> passed to Return).
>
> I don't see how an asyncio coroutine calling "yield from
> trollius_coroutine" can handle the Return exception if it doesn't
> inherit from StopIteration. Does it mean that I have to drop this
> feature in Python 3.5 (or later, when PEP 479 becomes effective)?
>
> I'm talking about the current behaviour of Python 3.3; I didn't try
> PEP 479 (I don't know if an implementation exists yet).
>

The issue here is that asyncio only interprets StopIteration as returning
from the generator (with a possible value), while a Trollius coroutine must
use "raise Return()" to specify a return value; this works as long
as Return is a subclass of StopIteration, but PEP 479 will break this by
replacing the StopIteration with RuntimeError.

It's an interesting puzzle.

The only way out I can think of is to have asyncio special-case the Return
exception -- we could do that by defining a new exception (e.g.
AlternateReturn) in asyncio that gets treated the same way as
StopIteration, so that Trollius can inherit from AlternateReturn (if it
exists).
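
As a sketch of the suggestion (the names AlternateReturn and
run_coroutine below are assumptions for illustration, not real asyncio
API):

```python
# Hypothetical: the event loop's step function treats an
# AlternateReturn exception the same way it treats StopIteration.
class AlternateReturn(Exception):
    def __init__(self, value=None):
        super().__init__(value)
        self.value = value

class Return(AlternateReturn):
    # A Trollius-style Return would subclass AlternateReturn instead of
    # StopIteration, so PEP 479 never converts it to RuntimeError.
    pass

def run_coroutine(coro):
    # Drive a coroutine, accepting either way of returning a value.
    while True:
        try:
            next(coro)
        except StopIteration as exc:
            return exc.value
        except AlternateReturn as exc:
            return exc.value

def trollius_style():
    yield
    raise Return(42)

print(run_coroutine(trollius_style()))  # 42
```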

What do you think?

-- 
--Guido van Rossum (python.org/~guido)


Re: [Python-Dev] PEP 479 and asyncio

2014-11-27 Thread Victor Stinner
2014-11-27 20:06 GMT+01:00 Guido van Rossum :
> The issue here is that asyncio only interprets StopIteration as returning
> from the generator (with a possible value),

I'm not sure that the issue is directly related to asyncio.

trollius_coro() raises a StopIteration to return the result to the
caller. The caller is "result = yield from coro", not the complex
Task._step() method. So it's pure Python, unless I missed something.

> The only way out I can think of is to have asyncio special-case the Return
> exception -- we could do that by defining a new exception (e.g.
> AlternateReturn) in asyncio that gets treated the same way as StopIteration,
> so that Trollius can inherit from AlternateReturn (if it exists).

I don't see how it would work.

Here is a simplified example of my issue. You would need to change
every "yield from coro" into "yield from catch_return(coro)", unless I
missed something important.
---
PEP479 = True
if not PEP479:
    # trollius: no need for catch_return() before the PEP 479
    class Return(StopIteration):
        pass
else:
    # PEP 479: need catch_return()
    class Return(Exception):
        def __init__(self, value):
            self.value = value

def return_value(value):
    if 0:
        yield
    raise Return(value)

def catch_return(gen):
    try:
        value = (yield from gen)
    except Return as exc:
        return exc.value
    return value  # pass through a normal return as well

def add_one(gen):
    value = (yield from gen)
    return value + 1

def consume_generator(gen):
    while True:
        try:
            next(gen)
        except StopIteration as exc:
            return exc.value

gen1 = return_value(3)
if PEP479:
    gen1 = catch_return(gen1)
gen2 = add_one(gen1)
print(consume_generator(gen2))
---

Victor


Re: [Python-Dev] PEP 479 and asyncio

2014-11-27 Thread Victor Stinner
2014-11-27 22:54 GMT+01:00 Victor Stinner :
> I don't see how it would work.

If it cannot be fixed, would it make sense to allow trollius to
continue to work as it currently works with something like "from
__past__ import generator_dont_stop"?

When I talked with a friend about the transition from Python 2 to
Python 3, he asked me why there was no "from __past__ import
division". He wants to add this to his code so he doesn't have to
worry that a division may fail "somewhere" in his code.

Maybe it would ease upgrades to newer versions of Python if we
consider keeping the old behaviour for people who don't have time to
port their old code (for no immediate benefit), but need to upgrade
because newer OS only provide newer version of Python.

(What is the cost of keeping the old behaviour: maintaining the code,
plus runtime overhead?)

Victor


Re: [Python-Dev] PEP 479 and asyncio

2014-11-27 Thread Chris Angelico
On Fri, Nov 28, 2014 at 8:54 AM, Victor Stinner
 wrote:
> def return_value(value):
>     if 0:
>         yield
>     raise Return(value)

This is one known significant backward-incompatibility issue with this
PEP - it'll be difficult to make this work on Python 2.7, where
"return value" would be a problem, and 3.7, where "raise
StopIteration" would be a problem. At present, I don't know of a
solution to this. In 3.x-only code, you could simply use 'return
value' directly; in 2.7 code, StopIteration doesn't seem to even
*have* a .value attribute (and .args[0] has to be used instead).
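
The 2.7/3.x discrepancy can be papered over with a small helper (an
illustrative sketch, not code from the thread):

```python
def stop_value(exc):
    """Best-effort extraction of a StopIteration 'return value' that
    works on Python 2.7 (.args[0]) as well as Python 3 (.value)."""
    value = getattr(exc, "value", None)
    if value is None and exc.args:
        value = exc.args[0]
    return value
```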

But I don't like the idea of a "from __past__" directive. It means
backward-compatibility code has to be maintained through eternity
(otherwise it just shifts the problem to "why did you remove my
__past__ directive, I want a from __paster__ import division"), which
means both the Python implementation code (every Python, not just
CPython) needs to cope, *and* everyone who reads Python code needs to
cope. For python-list, Stack Overflow, and other such coding help
places, this means more questions to ask about a piece of code. For
real-world usage, it means scanning back up to the top of the file
every time you read something that's been affected by a __past__
directive.

Plus, which __future__ directives need __past__ equivalents?
Personally, I wouldn't bother making "from __past__ import
lack_of_with_statement", but your friend is wanting "division", and
I'm sure "print_statement" would be wanted... and, this is the one
that'd split everyone and put the sides to war: "bytes_literals".
Personally, I would want python-dev to say "There will NEVER be a from
__past__ import bytes_literals directive", but there are going to be
others who say "But my code would be so much cleaner AND faster if you
do!", and IMO this is a good reason to avoid having any __past__
directives at all.

ChrisA


Re: [Python-Dev] Please reconsider PEP 479.

2014-11-27 Thread Nick Coghlan
On 28 November 2014 at 02:52, Guido van Rossum  wrote:
> On Thu, Nov 27, 2014 at 3:04 AM, Nick Coghlan  wrote:
>> If compatibility with older Python versions is needed, then you could
>> put something like the following in a compatibility module:
>>
>> try:
>>     from itertools import allow_implicit_stop
>> except ImportError:
>>     # Allowing implicit stops is the default in older versions
>>     def allow_implicit_stop(g):
>>         return g
>
>
> I understand that @allow_implicit_stop represents a compromise, an attempt at
> calming the waves that PEP 479 has caused. But I still want to push back
> pretty hard on this idea.
>
> - It means we're forever stuck with two possible semantics for StopIteration
> raised in generators.
>
> - It complicates the implementation, because (presumably) a generator marked
> with @allow_implicit_stop should not cause a warning when a StopIteration
> bubbles out -- so we actually need another flag to silence the warning.

Ugh, you're right. I'd missed that :(

> - I don't actually know whether other Python implementations have the
> ability to copy code objects to change flags.

I was originally thinking that implicitly catching the RuntimeError
and converting it back to StopIteration could be an acceptable "worst
case" implementation, but I subsequently realised that interacts
differently with yield from than the status quo does.

> - It actually introduces a new incompatibility, that has to be solved in
> every module that wants to use it (as you show above), whereas just putting
> try/except around unguarded next() calls is fully backwards compatible.
>
> - Its existence encourages people to use the decorator instead of fixing
> their code properly.
>
> - The decorator is so subtle that it probably needs to be explained to
> everyone who encounters it (and wasn't involved in this PEP discussion).
> Because of this I would strongly advise against using it to "fix" the
> itertools examples in the docs; it's just too magical. (IIRC only 2 examples
> actually depend on this.)

Yeah, if not for the status quo, there's no way I'd have suggested it
at all. As it is, you've persuaded me that preserving this capability
indefinitely at the eval loop level isn't worth the extra complexity
(in particular, I'd missed the "add yet another flag to suppress the
warning" issue).

So now I'm wondering if the peephole optimiser could be updated to
pick up the "except -> return" idiom...

> Let me also present another (minor) argument for PEP 479. Sometimes you want
> to take a piece of code presented as a generator and turn it into something
> else. You can usually do this pretty easily by e.g. replacing every "yield"
> by a call to print() or list.append(). But if there are any bare next()
> calls in the code you have to beware of those. If the code was originally
> written without relying on bare next(), the transformation would have been
> easier.

+1

The scenario you describe there strikes me as the statement level
equivalent of the behavioural discrepancies between calling next() in
a generator expression vs doing it in any other kind of comprehension.
In the function definition case, once the "yield" is removed from
elsewhere in the function (so it's no longer a generator), it changes
the semantics of any unguarded next() calls.

That's the kind of side effect that's pretty easy for both automated
testing and code review to miss.
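
The comprehension-vs-genexp discrepancy can be seen directly (the
sketch below shows the behaviour on Python 3.7+, where PEP 479 is fully
in effect):

```python
# A bare next() on an empty iterator behaves differently in a list
# comprehension (StopIteration propagates unchanged) than in a
# generator expression (converted to RuntimeError under PEP 479;
# before the PEP it silently truncated the result instead).
def comprehension_case():
    empty = iter([])
    try:
        [next(empty) for _ in range(1)]
    except StopIteration:
        return "StopIteration escapes"

def genexp_case():
    empty = iter([])
    try:
        list(next(empty) for _ in range(1))
    except RuntimeError:
        return "RuntimeError"
```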

Regards,
Nick.

-- 
Nick Coghlan   |   [email protected]   |   Brisbane, Australia


Re: [Python-Dev] PEP 479 and asyncio

2014-11-27 Thread Nick Coghlan
On 28 November 2014 at 08:09, Victor Stinner  wrote:
> 2014-11-27 22:54 GMT+01:00 Victor Stinner :
>> I don't see how it would work.
>
> If it cannot be fixed, would it make sense to allow trollius to
> continue to work as it currently works with something like "from
> __past__ import generator_dont_stop"?

I think between contextlib and Trollius, the case is starting to be
made for raising an UnhandledStopIteration subclass of RuntimeError,
rather than a generic RuntimeError. We have at least two known cases
now where code that works with generators-as-coroutines has a valid
reason for wanting to distinguish "arbitrary runtime error" from
"unhandled StopIteration exception". While catching RuntimeError and
looking for StopIteration in __cause__ *works*, it feels messier and
harder to explain than just naming the concept by giving it a
dedicated exception type.
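
The "messier" pattern being replaced looks roughly like this (a sketch
under Python 3.7+ semantics, where the RuntimeError chains the original
StopIteration as its __cause__):

```python
def broken():
    yield next(iter([]))  # bare next() on an empty iterator

g = broken()
try:
    next(g)
except RuntimeError as exc:
    # Without a dedicated subclass, callers must inspect __cause__
    # to tell an unhandled StopIteration from any other RuntimeError.
    if isinstance(exc.__cause__, StopIteration):
        kind = "unhandled StopIteration"
    else:
        kind = "other RuntimeError"
```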

Trollius would still need an adapter to be called from asyncio,
though. Something like:

def implicit_stop(g):
    try:
        yield from g
    except UnhandledStopIteration as exc:
        return exc.__cause__.value

Then Victor's example would become:

class Return(StopIteration):
    pass

def return_value(value):
    if 0:
        yield
    raise Return(value)

def add_one(gen):
    value = (yield from gen)
    return value + 1

def consume_generator(gen):
    while True:
        try:
            next(gen)
        except StopIteration as exc:
            return exc.value

gen1 = return_value(3)
if PEP479:
    gen1 = implicit_stop(gen1)
gen2 = add_one(gen1)
print(consume_generator(gen2))

> When I talked with a friend about the transition from Python 2 to
> Python 3, he asked me why there was not "from __past__ import
> division". He wants to add this to his code to not have to worry that
> a division may fail "somewhere" in his code.
>
> Maybe it would ease upgrades to newer versions of Python if we
> consider keeping the old behaviour for people who don't have time to
> port their old code (for no immediate benefit), but need to upgrade
> because newer OS only provide newer version of Python.
>
> (What is the cost of keeping the old behaviour: maintain the code and
> runtime overhead?)

The main problem with *never* deprecating anything is an
ever-increasing cognitive burden in learning the language, as well as
losing the ability to read code in isolation without knowing what
flags are in effect.

Currently, folks that only work in Python 3 don't need to know how
division worked in Python 2, or that print was ever a statement, etc.
If those old behaviours could be selectively turned back on, then
everyone would still need to learn them, and you couldn't review code
in isolation any more: there may be a __past__ import at the top of
the module making it do something different.

If organisations really want to let their code bitrot (and stay on the
treadmill of big expensive high risk updates every decade or so), they
can, but they have to do it by running on old versions of Python as
well - that gives maintainers a clear understanding that if they want
to understand the code, they have to know how Python X.Y worked,
rather than being able to assume modern Python.

Regards,
Nick.

-- 
Nick Coghlan   |   [email protected]   |   Brisbane, Australia