Re: [Python-Dev] PEP 492 vs. PEP 3152, new round

2015-04-30 Thread Greg Ewing

Yury Selivanov wrote:

Sorry, but I'm not sure where and when I had any trouble
predicting the consequences...


You said you wanted 'await a() * b()' to be a syntax
error, but your grammar allows it.

--
Greg
___
Python-Dev mailing list
[email protected]
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] PEP 492: No new syntax is required

2015-04-30 Thread Paul Sokolovsky
Hello,

On Mon, 27 Apr 2015 08:48:49 +0100
Mark Shannon  wrote:

> 
> 
> On 27/04/15 00:13, Guido van Rossum wrote:
> > But new syntax is the whole point of the PEP. I want to be able to
> > *syntactically* tell where the suspension points are in coroutines.
> Doesn't "yield from" already do that?
> 
> > Currently this means looking for yield [from]; PEP 492 just adds
> > looking for await and async [for|with]. Making await() a function
> > defeats the purpose because now aliasing can hide its presence, and
> > we're back in the land of gevent or stackless (where *anything* can
> > potentially suspend the current task). I don't want to live in that
> > land.
>
> I don't think I was clear enough. I said that "await" *is* a
> function, not that is should be disguised as one.

Yes, that's what you said, but it is not one. I guess other folks left
figuring that out to you, as a worthwhile exercise. Hint: await appears to
translate to GET_AWAITABLE and YIELD_FROM opcodes. If your next reply
is "I told you so", then you again miss that "await" is a special
Python language construct (effectively, an operator), while the fact that
it's implemented as GET_AWAITABLE and YIELD_FROM opcodes in CPython is
only CPython's implementation detail, CPython being just one (random)
Python language implementation.
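For the curious, the claim about opcodes is easy to check on CPython with the dis module. A sketch of such a check (on CPython 3.5-3.10 await pairs GET_AWAITABLE with YIELD_FROM; later versions pair it with SEND, which only underlines that the opcodes are an implementation detail):

```python
import dis

async def sub():
    return 42

async def demo():
    return await sub()   # the 'await' special form under inspection

# Inspect the compiled bytecode without ever running the coroutine.
ops = {ins.opname for ins in dis.get_instructions(demo)}
assert "GET_AWAITABLE" in ops   # CPython marks the await site with this opcode
```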

> Reading the code, 
> "GetAwaitableIter" would be a better name for that element of the 
> implementation. It is a straightforward non-blocking function.

Based on this passage, my guess is that you're missing the difference
between C and Python functions. At the C level, there are only functions,
which are used to implement everything (C doesn't offer anything else). But
at the Python level, there is a larger variety: functions, methods, and
special forms (a term with a bow to Scheme: a construct which you can't
implement in terms of other functions and which may have behavior they
can't have). "await" is a special form. The fact that it's implemented by a
C function (or not exactly, as noted above) is just CPython's
implementation detail. Arguing about what "await" should be based on what
you saw in C code is putting it all backwards.


-- 
Best regards,
 Paul  mailto:[email protected]


Re: [Python-Dev] A macro for easier rich comparisons

2015-04-30 Thread Petr Viktorin
On Tue, Apr 28, 2015 at 11:13 AM, Victor Stinner
 wrote:
> Hi,
>
> 2015-04-27 16:02 GMT+02:00 Petr Viktorin :
>> A macro like this would reduce boilerplate in stdlib and third-party C
>> extensions. It would ease porting C extensions to Python 3, where rich
>> comparison is mandatory.
>
> It would be nice to have a six module for C extensions. I'm quite sure
> that many projects are already full of #ifdef PYTHON3 ... #else ...
> #endif macros.

The idea actually came from my work on such a library:
http://py3c.readthedocs.org/en/latest/

>> #define Py_RETURN_RICHCOMPARE(val1, val2, op)   \
>> do {\
>> switch (op) {   \
>> case Py_EQ: if ((val1) == (val2)) Py_RETURN_TRUE; Py_RETURN_FALSE;  \
>> case Py_NE: if ((val1) != (val2)) Py_RETURN_TRUE; Py_RETURN_FALSE;  \
>> case Py_LT: if ((val1) < (val2)) Py_RETURN_TRUE; Py_RETURN_FALSE;   \
>> case Py_GT: if ((val1) > (val2)) Py_RETURN_TRUE; Py_RETURN_FALSE;   \
>> case Py_LE: if ((val1) <= (val2)) Py_RETURN_TRUE; Py_RETURN_FALSE;  \
>> case Py_GE: if ((val1) >= (val2)) Py_RETURN_TRUE; Py_RETURN_FALSE;  \
>> }   \
>> Py_RETURN_NOTIMPLEMENTED;   \
>> } while (0)
>
> I would prefer a function for that:
>
> PyObject *Py_RichCompare(long val1, long val2, int op);

The original version of the macro used ternary statements. This was
shot down because a chain of comparisons would be slower than a case
statement. (See discussion on the issue.) Wouldn't a function call
also be slower?
Also, a function with long arguments won't work on unsigned long or long long.

> You should also handle invalid operator. PyUnicode_RichCompare() calls
> PyErr_BadArgument() in this case.

There are many different precedents, from ignoring this case to doing
an assert. Is PyErr_BadArgument() better than returning
NotImplemented?
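For reference, a pure-Python mirror of the macro's dispatch logic (the op codes Py_LT..Py_GE have the CPython values 0..5; this mapping is illustrative, not part of the proposal):

```python
import operator

# CPython's rich-comparison op codes and their meanings (Py_LT=0 .. Py_GE=5).
_OPS = {
    0: operator.lt,  # Py_LT
    1: operator.le,  # Py_LE
    2: operator.eq,  # Py_EQ
    3: operator.ne,  # Py_NE
    4: operator.gt,  # Py_GT
    5: operator.ge,  # Py_GE
}

def rich_compare(val1, val2, op):
    """Mirror of Py_RETURN_RICHCOMPARE: dispatch on op, else NotImplemented."""
    func = _OPS.get(op)
    if func is None:
        return NotImplemented  # the macro's Py_RETURN_NOTIMPLEMENTED branch
    return func(val1, val2)
```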

> Anyway, please open an issue for this idea.

http://bugs.python.org/issue23699


Re: [Python-Dev] A macro for easier rich comparisons

2015-04-30 Thread Petr Viktorin
On Tue, Apr 28, 2015 at 4:59 PM, Barry Warsaw  wrote:
> On Apr 28, 2015, at 11:13 AM, Victor Stinner wrote:
>
>>It would be nice to have a six module for C extensions. I'm quite sure
>>that many projects are already full of #ifdef PYTHON3 ... #else ...
>>#endif macros.
>
> Maybe encapsulating some of the recommendations here:
>
> https://wiki.python.org/moin/PortingToPy3k/BilingualQuickRef#Python_extension_modules

py3c (or its documentation) now has all that except REPRV (with an
alias for the native string type, e.g. "PyStr", you can use that in
all reprs, so REPRV strikes me as somewhat redundant).

> (We really need to collect all this information in on place.)
>
>>> #define Py_RETURN_RICHCOMPARE(val1, val2, op)
>
> I think this macro would make a nice addition to the C API.  It might read
> better as `Py_RETURN_RICHCOMPARE(val1, op, val2)`.

(val1, val2, op) mirrors richcmp and PyObject_RichCompareBool; I think
a different order of arguments would just be confusing.


Re: [Python-Dev] Issues with PEP 482 (1)

2015-04-30 Thread Paul Sokolovsky
Hello,

On Tue, 28 Apr 2015 19:44:53 +0100
Mark Shannon  wrote:

[]

> A coroutine without a yield statement can be defined simply and 
> concisely, thus:
> 
> @coroutine
> def f():
>  return 1

[]

> A pure-python definition of the "coroutine" decorator is
> given below.
> 

[]

> from types import FunctionType, CodeType
> 
> CO_COROUTINE = 0x0080
> CO_GENERATOR = 0x0020
> 
> def coroutine(f):
>  'Converts a function to a generator function'
>  old_code = f.__code__
>  new_code = CodeType(
>  old_code.co_argcount,
>  old_code.co_kwonlyargcount,


This is a joke, right? This code has nothing to do with *Python*.
This code deals with internal implementation details of *CPython*. No
other Python implementation would have anything like it (because then
it would be just another CPython, and there's clearly no need to have
two or more CPythons). The code above is about as helpful as saying "you
can write some magic values at some magic memory addresses to solve any
problem you ever have".

All that is rather far away from making coroutine writing in Python
easier and less error-prone, which is the topic of PEP 482.


-- 
Best regards,
 Paul  mailto:[email protected]


Re: [Python-Dev] PEP 492: No new syntax is required

2015-04-30 Thread Paul Sokolovsky
Hello,

On Tue, 28 Apr 2015 20:59:18 +0100
Mark Shannon  wrote:

> 
> 
> On 28/04/15 20:24, Paul Sokolovsky wrote:
> > Hello,
> >
> [snip]
> 
> > Based on all this passage, my guess is that you miss difference
> > between C and Python functions.
> This is rather patronising, almost to the point of being insulting.
> Please keep the debate civil.

And yet people do make mistakes and misunderstand, and someone should
study the social psychology of programming - why this happens, what the
typical patterns and root causes are, etc. I don't think you should feel
insulted, especially if you think your points are right - then
all counter-arguments will either help you understand the other side, or
will just look funny.

> 
> [snip]
> 
> Cheers,
> Mark.



-- 
Best regards,
 Paul  mailto:[email protected]


Re: [Python-Dev] Unicode literals in Python 2.7

2015-04-30 Thread Alexander Walters

Does this not work for you?

from __future__ import unicode_literals


On 4/28/2015 16:20, Adam Bartoš wrote:

Hello,

is it possible to somehow tell Python 2.7 to compile code entered in 
the interactive session with the flag PyCF_SOURCE_IS_UTF8 set? I'm 
considering adding support for Python 2 in my package 
(https://github.com/Drekin/win-unicode-console) and I have run into 
the fact that when u"α" is entered in the interactive session, it 
results in u"\xce\xb1" rather than u"\u03b1". As this seems to be a 
highly specialized question, I'm asking it here.


Regards, Drekin




Re: [Python-Dev] Issues with PEP 482 (1)

2015-04-30 Thread Paul Sokolovsky
Hello,

On Tue, 28 Apr 2015 21:00:17 +0100
Mark Shannon  wrote:

[]

> >> CO_COROUTINE = 0x0080
> >> CO_GENERATOR = 0x0020
> >>
> >> def coroutine(f):
> >>   'Converts a function to a generator function'
> >>   old_code = f.__code__
> >>   new_code = CodeType(
> >>   old_code.co_argcount,
> >>   old_code.co_kwonlyargcount,
> >
> >
> > This is a joke, right?
> Well it was partly for entertainment value, although it works on PyPy.
> 
> The point is that something that can be done with a decorator,
> whether in pure Python or as builtin, does not require new syntax.

And that's exactly not what Python is and not how it evolves. Unlike
Scheme, it doesn't offer some minimal orthogonal basis out of which
everything can be derived by functional application. Instead, it's more
pragmatic and offers a plethora of (well-defined, unlike in many other
languages) concepts and implementations to choose from. And if it so
happens that practice shows some concept needs a "slight" redefinition,
that concept is made first-class, despite the fact that it matches 90%
of the semantics of another concept. Fortunately, at the implementation
level, those 90% of semantics are shared, so it is not outright "bloat".

The current wishful thinking of this PEP is that more people will know
and use "await", while "yield from" will keep being understood and used
by not quite every Python programmer.

(Just to state the obvious, all of the above is actually my own attempt
to grasp it, and is reverse causation - trying to explain Python's
progress in terms of how this particular PEP 482 and its older friends
progressed; it may be quite different for other aspects of the language.
I for one am rather surprised that the BDFL is so positive about this
PEP.)

> 
> Cheers,
> Mark.
> 



-- 
Best regards,
 Paul  mailto:[email protected]


Re: [Python-Dev] Issues with PEP 482

2015-04-30 Thread Elvis Pranskevichus
Hi Mark,


Mark Shannon wrote:

> Hi,
> 
> I still think that there are several issues that need addressing with
> PEP 492. This time, one issue at a time :)
> 
> "async"
> 
> The "Rationale and Goals" of PEP 492 states that PEP 380 has 3
> shortcomings. The second of which is:
>  """It is not possible to natively define a coroutine which has no
> yield or yield from statements."""
> This is incorrect, although what is meant by 'natively' is unclear.
> 
> A coroutine without a yield statement can be defined simply and
> concisely, thus:
> 
> @coroutine
> def f():
>  return 1
> 
> This is only a few characters longer than the proposed new syntax,
> perfectly explicit and requires no modification to the language whatsoever.
> A pure-python definition of the "coroutine" decorator is given below.
> 
> So could the "Rationale and Goals" be corrected accordingly, please.
> Also, either the "async def" syntax should be dropped, or a new
> justification is required.
> 
> Cheers,
> Mark.
> 

As was previously mentioned, new async syntax is a major point of PEP 492. 

Coroutine-based concurrent programming is something that a lot of languages 
and platforms are adopting as a first-class feature.  Just look at the list 
of references in the PEP.

While the specific coroutine-definition point you are arguing about can 
certainly be debated, you seem to disregard the other points PEP 492 
is raising, which boil down to making coroutines a first-class object in 
Python, with all the robustness and support that implies.  Decorators in 
Python are an auxiliary feature that has nothing to do with core language 
semantics.

Also, "async for" and "async with" are just as important in concurrent 
programs as regular "for" and "with" are in sequential programs.  Saying 
"no, we don't need new syntax", when lots of folks think we do, is just a 
contradiction without a real argument.


  Elvis



Re: [Python-Dev] PEP 492 vs. PEP 3152, new round

2015-04-30 Thread Isaac Schwabacher
On 15-04-29, Yury Selivanov  wrote:
> Hi Ethan,
> 
> On 2015-04-29 2:32 PM, Ethan Furman wrote:
> >On 04/29, Yury Selivanov wrote:
> >>On 2015-04-29 1:25 PM, Ethan Furman wrote:
> >>>cannot also just work and be the same as the parenthesized
> >>>version.
> >>Because it does not make any sense.
> >I obviously don't understand your position that "it does not make
> >any sense" -- perhaps you could explain a bit?
> >
> >What I see is a suspension point that is waiting for the results of
> >coro(), which will be negated (and returned/assigned/whatever).
> >What part of that doesn't make sense?
> >
> 
> Because you want operators to be resolved in the
> order you see them, generally.
> 
> You want '(await -fut)' to:
> 
> 1. Suspend on fut;
> 2. Get the result;
> 3. Negate it.
> 
> This is a non-obvious thing. I would myself interpret it
> as:
> 
> 1. Get fut.__neg__();
> 2. await on it.
> 
> So I want to make this syntactically incorrect:

Does this need to be a syntax error? -"hello" raises TypeError because str 
doesn't have a __neg__, but there's no reason a str subclass couldn't define 
one. "TypeError: bad operand type for unary -: 'asyncio.Future'" is enough to 
clear up any misunderstandings, and if someone approaching a new language 
construct doesn't test their code well enough to at least execute all the code 
paths, the difference between a compile-time SyntaxError and a run-time 
TypeError is not going to save them.
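For the record, the grammar Yury proposed does reject the prefix form at compile time. A quick check, as this behaves in released CPython 3.5+:

```python
def compiles(src):
    """Return True if src compiles, False on SyntaxError."""
    try:
        compile(src, "<check>", "exec")
        return True
    except SyntaxError:
        return False

# '- await fut' is fine: unary minus applies to the awaited result.
assert compiles("async def f(fut):\n    return - await fut\n")

# 'await -fut' is rejected: await's operand must bind tighter than unary '-'.
assert not compiles("async def f(fut):\n    return await -fut\n")
```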

ijs

> 'await -fut' would throw a SyntaxError. To do what you
> want, write a pythonic '- await fut'.
> 
> 
> Yury
> 


Re: [Python-Dev] PEP 492 vs. PEP 3152, new round

2015-04-30 Thread Isaac Schwabacher
On 15-04-29, Yury Selivanov  wrote:
> 
> 
> On 2015-04-29 3:24 PM, Isaac Schwabacher wrote:
> >On 15-04-29, Yury Selivanov wrote:
> >>Hi Ethan,
> [..]
> >>So I want to make this syntactically incorrect:
> >Does this need to be a syntax error? -"hello" raises TypeError because str 
> >doesn't have a __neg__, but there's no reason a str subclass couldn't define 
> >one. "TypeError: bad operand type for unary -: 'asyncio.Future'" is enough 
> >to clear up any misunderstandings, and if someone approaching a new language 
> >construct doesn't test their code well enough to at least execute all the 
> >code paths, the difference between a compile-time SyntaxError and a run-time 
> >TypeError is not going to save them.
> 
> The grammar of the language should match the most common
> use case.
> 
> FWIW, I've just updated the pep with a precedence table:
> https://hg.python.org/peps/rev/d355918bc0d7

I'd say the grammar of the language should be the least surprising overall, 
which definitely means it should be clean in the most common case, but doesn't 
mean it has to go out of its way to make other cases difficult. But reading 
that precedence table makes everything clear: the proper comparison isn't 
(-"hello"), it's (-not False), which *is* a syntax error. Silly prefix 
operators.
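The comparison is easy to verify: the grammar already rejects a prefix 'not' after unary minus, while -"hello" compiles and only fails at runtime:

```python
def is_syntax_error(src):
    """Return True if src fails to compile with a SyntaxError."""
    try:
        compile(src, "<check>", "eval")
        return False
    except SyntaxError:
        return True

assert is_syntax_error("-not False")     # prefix 'not' can't follow '-'
assert not is_syntax_error('-"hello"')   # compiles; fails only at runtime
```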

ijs


Re: [Python-Dev] PEP 492: What is the real goal?

2015-04-30 Thread Paul Sokolovsky
Hello,

On Wed, 29 Apr 2015 20:19:40 +0100
Paul Moore  wrote:

[]

> Thanks for that. That does look pretty OK. One question, though - it
> uses an asyncio Queue. The original code would work just as well with
> a list, or more accurately, something that wasn't designed for async
> use. So the translation isn't completely equivalent. Also, can I run
> the produce/consume just by calling produce()? My impression is that
> with asyncio I need an event loop - which "traditional" coroutines
> don't need. Nevertheless, the details aren't so important, it was only
> a toy example anyway.

All this confusion stems from the fact that the Wikipedia article fails to
clearly provide classification dichotomies for coroutines. I suggest
reading the Lua coroutine description as a much better attempt at
classification: http://www.lua.org/pil/9.1.html . It is, for example,
explicit in mentioning a common pitfall: "Some people call asymmetric
coroutine semi-coroutines (because they are not symmetrical, they are
not really co). However, other people use the same term semi-coroutine
to denote a restricted implementation of coroutines". Comparing that
to the Wikipedia article, you'll notice that it uses "semicoroutine" in
just one of those senses; different people attach the "semi" part to
different classification axes.

So, trying to draw a table from Lua's text, there are the following two axes:

Axis 1: Symmetric vs Asymmetric

Asymmetric coroutines use two control flow constructs, akin to
subroutine call and return. (Names vary; the return half is usually called
yield.)

Symmetric coroutines use only one. You can think of them as only calling
or only returning, though a less confusing term is "switching to".

Axis 2: "Lexical" vs "Dynamic"

The naming here is less standardized. Lua calls its own coroutines "true"
coroutines and the other kind "generators"; others say "coroutines" vs
"generators". But the real difference is intuitively akin to lexical vs
dynamic scoping. "Lexical" coroutines require explicit marking of each
(including recursive) call into a coroutine. "Dynamic" ones do not - you
can call a normal-looking function, and it may suddenly pass control
somewhere else (another coroutine), without you having a clue about it.


All *four* recombined types above are coroutines, albeit with
slightly different properties. Symmetric dynamic coroutines are the
most powerful type - as powerful as an abyss. They are what is usually
used to frighten the innocent; Wikipedia shows an example of them.

No sane real-world language uses symmetric coroutines - they're not
useful without continuations, and sane real-world people don't want
to manage continuations manually. Python, Lua, and C# use asymmetric
coroutines.

Python and C# use asymmetric "lexical" coroutines - the simplest, and
thus safest, type, but one with limitations on doing mind-boggling
things.

Lua has "dynamic" asymmetric coroutines - a more powerful, and thus more
dangerous, type (you'd want to look with a jaundiced eye at a framework
based on "dynamic" coroutines - better rewrite it from scratch before
you trust it).
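The "lexical" marking Python requires is visible in plain generators: every level of a suspendable call chain must say yield from explicitly. A sketch:

```python
def inner():
    yield 1
    yield 2

def outer():
    yield 0
    # The delegation must be marked explicitly; a bare inner() call would
    # just create (and discard) a generator object without ever suspending.
    yield from inner()

assert list(outer()) == [0, 1, 2]
```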



-- 
Best regards,
 Paul  mailto:[email protected]


Re: [Python-Dev] PEP 492 quibble and request

2015-04-30 Thread Paul Sokolovsky
Hello,

On Wed, 29 Apr 2015 20:33:09 -0400
Yury Selivanov  wrote:

> Hi Ethan,
> 
> On 2015-04-29 8:21 PM, Ethan Furman wrote:
> >  From the PEP:
> >
> >> Why not a __future__ import
> >>
> >> __future__ imports are inconvenient and easy to forget to add.
> > That is a horrible rationale for not using an import.  By that
> > logic we should have everything in built-ins.  ;)
> >
> >
> >> Working example
> >> ...
> > The working example only uses async def and await, not async with
> > nor async for nor __aenter__, etc., etc.
> >
> > Could you put in a more complete example -- maybe a basic chat room
> > with both server and client -- that demonstrated more of the new
> > possibilities?
> 
> Andrew Svetlov has implemented some new features in his
> aiomysql driver:
> 
> https://github.com/aio-libs/aiomysql/blob/await/tests/test_async_iter.py
> 
> I don't want to cite it in the PEP because it's not complete
> yet, and some idioms (like 'async with') aren't used to their
> full potential.
> 
> >
> > Having gone through the PEP again, I am still no closer to
> > understanding what happens here:
> >
> >data = await reader.read(8192)
> >
> > What does the flow of control look like at the interpreter level?
> 
> 'await' is semantically equivalent to 'yield from' in this line.
> 
> To really understand all implementation details of this line
> you need to read PEP 3156 and experiment with asyncio. There
> is no easier way, unfortunately.  I can't add a super detailed
> explanation how event loops can be implemented in PEP 492,
> that's not in its scope.
> 
> The good news is that to use asyncio on a daily basis you
> don't need to know all details, as you don't need to know
> how 'ceval.c' works and how 'setTimeout' is implemented in
> JavaScript.

+1

But if you really want to, you can. The likely reason for doing so would
be a desire to implement "yield from" for an alternative Python
implementation. You can take inspiration from a diagram I drew while
implementing "yield from" for MicroPython:
https://dl.dropboxusercontent.com/u/44884329/yield-from.pdf


-- 
Best regards,
 Paul  mailto:[email protected]


Re: [Python-Dev] Clarification of PEP 476 "opting out" section

2015-04-30 Thread M.-A. Lemburg
On 30.04.2015 02:33, Nick Coghlan wrote:
> Hi folks,
> 
> This is just a note to highlight the fact that I tweaked the "Opting
> out" section in PEP 476 based on various discussions I've had over the
> past few months: https://hg.python.org/peps/rev/dfd96ee9d6a8
> 
> The notable changes:
> 
> * the example monkeypatching code handles AttributeError when looking
> up "ssl._create_unverified_context", in order to accommodate older
> versions of Python that don't have PEP 476 implemented
> * new paragraph making it clearer that while the intended use case for
> the monkeypatching trick is as a workaround to handle environments
> where you *know* HTTPS certificate verification won't work properly
> (including explicit references to sitecustomize.py and Standard
> Operating Environments for Python), there's also a secondary use case
> in allowing applications to provide a system administrator controlled
> setting to globally disable certificate verification (hence the change
> to the example code)
> * new paragraph making it explicit that even though we've improved
> Python's default behaviour, particularly security sensitive
> applications should still provide their own context rather than
> relying on the defaults

Can we please make the monkeypatch a regular part of Python's
site.py, which can be enabled via an environment variable, say
export PYTHONHTTPSVERIFY=0.

See http://bugs.python.org/issue23857 for the discussion.

Esp. for Python 2.7.9 the default verification from PEP 476
is causing problems for admins who want to upgrade their
Python installation without breaking applications using
Python. They need an easy and official non-hackish way to
opt-out from the PEP 476 default on a per application basis.

Thanks,
-- 
Marc-Andre Lemburg
eGenix.com

Professional Python Services directly from the Source  (#1, Apr 30 2015)
>>> Python Projects, Coaching and Consulting ...  http://www.egenix.com/
>>> mxODBC Plone/Zope Database Adapter ...   http://zope.egenix.com/
>>> mxODBC, mxDateTime, mxTextTools ...http://python.egenix.com/


: Try our mxODBC.Connect Python Database Interface for free ! ::

   eGenix.com Software, Skills and Services GmbH  Pastor-Loeh-Str.48
D-40764 Langenfeld, Germany. CEO Dipl.-Math. Marc-Andre Lemburg
   Registered at Amtsgericht Duesseldorf: HRB 46611
   http://www.egenix.com/company/contact/


Re: [Python-Dev] PEP 492: What is the real goal?

2015-04-30 Thread Paul Moore
On 30 April 2015 at 06:39, Greg Ewing  wrote:
> Aaargh, this is what we get for overloading the word
> "coroutine". The Wikipedia article is talking about a
> technique where coroutines yield control to other
> explicitly identified coroutines.

Yep, I understand that. It's just that that's what I understand by coroutines.

> Coroutines in asyncio don't work that way; instead
> they just suspend themselves, and the event loop
> takes care of deciding which one to run next.

Precisely. As I say, the terminology is probably not going to change
now - no big deal in practice.
Paul


Re: [Python-Dev] Unicode literals in Python 2.7

2015-04-30 Thread Stephen J. Turnbull
Chris Angelico writes:

 > It's legal Unicode, but it doesn't mean what he typed in.

Of course, that's obvious.  My point is "Welcome to the wild wacky
world of soi-disant 'internationalized' software, where what you see
is what you get regardless of what you type."





Re: [Python-Dev] PEP 492 vs. PEP 3152, new round

2015-04-30 Thread Greg Ewing

Nathaniel Smith wrote:

Even if we put aside our trained intuitions about arithmetic, I think
it's correct to say that the way unary minus is parsed is: everything
to the right of it that has a tighter precedence gets collected up and
parsed as an expression, and then it takes that expression as its
argument.


Tighter or equal, actually: '--a' is allowed.

This explains why Yury's syntax disallows 'await -f'.
The 'await' operator requires something after it, but
there's *nothing* between it and the following '-',
which binds less tightly.

So it's understandable, but you have to think a bit
harder.

Why do we have to think harder? I suspect it's because
the notion of precedence is normally introduced to resolve
ambiguities. Knowing that infix '*' has higher precedence
than infix '+' tells us that 'a + b * c' is parsed as
'a + (b * c)' and not '(a + b) * c'.

Similarly, knowing that infix '.' has higher precedence
than prefix '-' tells us that '-a.b' is parsed as
'-(a.b)' rather than '(-a).b'.
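The attribute example can be confirmed with the ast module:

```python
import ast

# '-a.b' parses as '-(a.b)': the unary minus is the outermost node and its
# operand is the whole attribute access.
node = ast.parse("-a.b", mode="eval").body
assert isinstance(node, ast.UnaryOp)
assert isinstance(node.op, ast.USub)
assert isinstance(node.operand, ast.Attribute)
```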

However, giving prefix 'await' higher precedence than
prefix '-' doesn't serve to resolve any ambiguity.
'- await f' is parsed as '-(await f)' either way, and
'await f + g' is parsed as '(await f) + g' either way.

So when we see 'await -f', we think we already know
what it means. There is only one possible order for
the operations, so it doesn't look as though precedence
comes into it at all, and we don't consider it when
judging whether it's a valid expression.

What's the conclusion from all this? I think it's
that using precedence purely to disallow certain
constructs, rather than to resolve ambiguities, leads
to a grammar with less-than-intuitive characteristics.

--
Greg


Re: [Python-Dev] PEP 492 quibble and request

2015-04-30 Thread Greg Ewing

Ethan Furman wrote:

Having gone through the PEP again, I am still no closer to understanding
what happens here:

  data = await reader.read(8192)

What does the flow of control look like at the interpreter level?


Are you sure you *really* want to know? For the sake
of sanity, I recommend ignoring the actual control
flow and pretending that it's just like

   data = reader.read(8192)

with the reader.read() method somehow able to be
magically suspended.
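For the brave, the suspension can be modeled with plain generators (a toy sketch; the real asyncio loop and transports are far more involved, and 'read'/'handler' here are made-up stand-ins):

```python
def read(n):
    # Suspend, telling the driver what we need; the driver's send() resumes
    # us with the data, which becomes this generator's return value.
    data = yield ("need", n)
    return data

def handler():
    data = yield from read(8192)   # 'await' behaves like this 'yield from'
    return data.upper()

g = handler()
request = next(g)            # run until read() suspends
assert request == ("need", 8192)
try:
    g.send("payload")        # the "event loop" delivers the bytes
except StopIteration as e:
    result = e.value
assert result == "PAYLOAD"
```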

--
Greg


Re: [Python-Dev] PEP 492 quibble and request

2015-04-30 Thread Paul Moore
On 30 April 2015 at 02:52, Nick Coghlan  wrote:
> This request isn't about understanding the implementation details,
> it's about understanding what Python *users* will gain from the PEP
> without *needing* to understand the implementation details.
>
> For that, I'd like to see some not-completely-trivial example code in
> (or at least linked from) the PEP written using:
>
> * trollius (no "yield from", Python 2 compatible, akin to Twisted's
> inlineDeferred's)
> * asyncio/tulip ("yield from", Python 3.3+ compatible)
> * PEP 492 (async/await, Python 3.5+ only)
>
> The *intent* of PEP 492, like PEP 380 and PEP 342 before it, is "make
> asynchronous programming in Python easier". I think it will actually
> succeed in that goal, but I don't think it currently does an
especially good job of explaining that to folks that aren't both already
> deeply invested in the explicitly asynchronous programming model *and*
> thoroughly aware of the fact that most of us need asynchronous
> programming to look as much like synchronous programming as possible
> in order for it to fit our brains.

I agree 100% on this. As things stand, asyncio feels frighteningly
complex to anyone who isn't deeply involved with it (it certainly does
to me). To an outsider, the current PEP feels as if it is solving
specialist problems and, to the average (non-asyncio) programmer, adds
language complexity with no real benefit.

It's not specific to PEP 492, but what I would like to see is:

1. More tutorial level examples of how to use asyncio. Specifically
*not* examples of how to write web services, or how to do async web
requests in your existing async program. Instead, how to integrate
asyncio into generally non-async code. For example, looking at pip, I
see a few places where I can anticipate asyncio might be useful - the
link-chasing code, the package download code, and the code to run
setup.py in a subprocess, seem like places where we could do stuff in
an async manner (it's not a big enough deal that we've ever used
threads, so I doubt we'd want to use asyncio either in practice, but
they are certainly the *types* of code I see as benefitting from
async).

2. Following on from this, how do I isolate async code from the rest
of my program (i.e. I don't want to have to rewrite my whole program
around an event loop just to run a bunch of background programs in
parallel)?

3. Clarification on the roles of async/await vs yield
from/generator.send. Are they both useful, and if so in what contexts
(ignoring "if you want to support Python 3.4" compatibility cases)?
How should a programmer choose which is appropriate?

4. A much better explanation of *when* any of the async constructs are
appropriate at all. The name "asyncio" implies IO, and all of the
examples further tend to imply "sockets". So the immediate impression
is that only socket programmers and people writing network protocols
should care.

Of course, if asyncio and the PEP *are* only really relevant to
network protocols, then my impressions are actually correct and I
should drop out of the discussion. But if that's the case, it seems
like a lot of language change for a relatively specialist use case.


Re: [Python-Dev] PEP 492 quibble and request

2015-04-30 Thread Paul Moore
On 30 April 2015 at 09:58, Greg Ewing  wrote:
> Ethan Furman wrote:
>>
>> Having gone through the PEP again, I am still no closer to understanding
>> what happens here:
>>
>>   data = await reader.read(8192)
>>
>> What does the flow of control look like at the interpreter level?
>
>
> Are you sure you *really* want to know? For the sake
> of sanity, I recommend ignoring the actual control
> flow and pretending that it's just like
>
>data = reader.read(8192)
>
> with the reader.read() method somehow able to be
> magically suspended.

Well, if I don't know, I get confused as to where I invoke the event
loop, how my non-async code runs alongside this, etc.
Paul


Re: [Python-Dev] PEP 492: async/await in Python; version 4

2015-04-30 Thread Greg Ewing

Yury Selivanov wrote:


3. CO_NATIVE_COROUTINE flag. This enables us to disable
__iter__ and __next__ on native coroutines while maintaining
full backwards compatibility.


I don't think you can honestly claim "full backwards
compatibility" as long as there are some combinations
of old-style and new-style code that won't work
together. You seem to be using your own personal
definition of "full" here.

--
Greg



Re: [Python-Dev] PEP 492 quibble and request

2015-04-30 Thread Antoine Pitrou
On Thu, 30 Apr 2015 10:02:17 +0100
Paul Moore  wrote:
> 
> Of course, if asyncio and the PEP *are* only really relevant to
> network protocols, then my impressions are actually correct and I
> should drop out of the discussion. But if that's the case, it seems
> like a lot of language change for a relatively specialist use case.

That's my impression too. There's nothing remotely horrible about
"yield from". Although we should have done the right thing from the
start, this is a lot of language churn to introduce *now*, not to
mention annoyingly similar but incompatible constructs that make the
language more difficult to understand.

So I'm rather -0.5 on the idea.

Regards

Antoine.




Re: [Python-Dev] Clarification of PEP 476 "opting out" section

2015-04-30 Thread Antoine Pitrou
On Thu, 30 Apr 2015 09:59:34 +0200
"M.-A. Lemburg"  wrote:
> 
> Can we please make the monkeypatch a regular part of Python's
> site.py which can enabled via an environment variable, say
> export PYTHONHTTPSVERIFY=0.

-1 (already explained in the bug below).

> See http://bugs.python.org/issue23857 for the discussion.

Regards

Antoine.




[Python-Dev] What's missing in PEP-484 (Type hints)

2015-04-30 Thread Dima Tisnek
# Syntactic sugar
"Beautiful is better than ugly," thus nice syntax is needed.
Current syntax is very mechanical.
Syntactic sugar is needed on top of current PEP.


# internal vs external
@intify
def foo() -> int:
    b = "42"
    return b  # check 1
x = foo() // 2  # check 2

Does the return type apply to implementation (str) or decorated callable (int)?
How can same annotation or a pair of annotations be used to:
* validate return statement type
* validate subsequent use
* look reasonable in the source code


# lambda
Not mentioned in the PEP, omitted for convenience or is there a rationale?
f = lambda x: None if x is None else str(x ** 2)
Current syntax seems to preclude annotation of `x` due to colon.
Current syntax sort of allows lambda return type annotation, but it's
easy to confuse with `f`.


# local variables
Not mentioned in the PEP
Non-trivial code could really use these.


# global variables
Not mentioned in the PEP
Module-level globals are part of API, annotation is welcome.
What is the syntax?


# comprehensions
[3 * x.data for x in foo if "bar" in x.type]
Arguably, annotation is only needed on `foo` here, but in more
complex comprehensions, e.g. below, the intermediate comprehension
could use an annotation:
[xx for y in [...] if ...]


# class attributes
s = socket.socket(...)
s.type, s.family, s.proto  # int
s.fileno  # callable
If annotations are only available for methods, it will lead to
Java-style explicit getters and setters.
Python language and data model prefers properties instead, thus
annotations are needed on attributes.


# plain data
user1 = dict(id=123,      # always int
             name="uuu",  # always str
             ...)         # other fields possible
smth = [42, "xx", ...]
(why not namedtuple? b/c extensible, mutable)
At least one PHP IDE allows annotating PDO.
Perhaps it's just bad taste in Python? Or is there a valid use-case?


# personal note
I think it's amazing how much thought has already been put into this
proposal. The foundation is pretty solid (per Guido talk). I am not at
all opposed to software that infers types (like jedi), or reads
user-specified types (like phpstorm and pep 484) and does something
good with that. In fact I'm ambivalent to current proposal, standard
and promise of better tools on one hand; narrow scope, debatable looks
on the other.


-- dima


Re: [Python-Dev] PEP 492: What is the real goal?

2015-04-30 Thread Greg Ewing

Paul Moore wrote:

Also, can I run
the produce/consume just by calling produce()? My impression is that
with asyncio I need an event loop - which "traditional" coroutines
don't need.


The Pythonic way to do things like that is to write
the producer as a generator, and the consumer as a
loop that iterates over it. Or the consumer as a
generator, and the producer as a loop that send()s
things into it.

To do it symmetrically, you would need to write them
both as generators (or async def functions or whatever)
plus a mini event loop to tie the two together.
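Concretely, the two asymmetric arrangements might look like this
(my own sketch, not from any particular framework):

```python
# Producer as a generator, consumer as a plain loop iterating over it.
def produce():
    for i in range(3):
        yield i * 2  # "create some new items"

total = 0
for item in produce():  # the consumer just iterates
    total += item
assert total == 6

# Or the other way around: consumer as a generator that the
# producer send()s items into.
def consume():
    received = []
    while True:
        item = yield received
        received.append(item)

consumer = consume()
next(consumer)  # prime the generator
for i in range(3):
    state = consumer.send(i)  # the producer pushes items in
assert state == [0, 1, 2]
```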

--
Greg


Re: [Python-Dev] What's missing in PEP-484 (Type hints)

2015-04-30 Thread Steven D'Aprano
On Thu, Apr 30, 2015 at 01:41:53PM +0200, Dima Tisnek wrote:

> # Syntactic sugar
> "Beautiful is better than ugly, " thus nice syntax is needed.
> Current syntax is very mechanical.
> Syntactic sugar is needed on top of current PEP.

I think the annotation syntax is beautiful. It reminds me of Pascal.


> # internal vs external
> @intify
> def foo() -> int:
>     b = "42"
>     return b  # check 1
> x = foo() // 2  # check 2
> 
> Does the return type apply to implementation (str) or decorated callable 
> (int)?

I would expect that a static type checker would look at foo, and flag 
this as an error. The annotation says that foo returns an int, but it 
clearly returns a string. That's an obvious error.

Here is how I would write that:


# Perhaps typing should have a Function type?
def intify(func: Callable[[], str]) -> Callable[[], int]:
    @functools.wraps(func)
    def inner() -> int:
        return int(func())
    return inner


@intify
def foo() -> str:
    b = "42"
    return b


That should, I hope, pass the type check, and without lying about the 
signature of *undecorated* foo.

The one problem with this is that naive readers will assume that 
*decorated* foo also has a return type of str, and be confused. That's a 
problem. One solution might be, "don't write decorators that change the 
return type", but that seems horribly restrictive. Another solution 
might be to write a comment:

@intify  # changes return type to int
def foo() -> str:
    ...

but that's duplicating information already in the intify decorator, and 
it relies on the programmer writing a comment, which people don't do 
unless they really need to.

I think that the only solution is education: given a decorator, you 
cannot assume that the annotations still apply unless you know what the 
decorator does.


> How can same annotation or a pair of annotations be used to:
> * validate return statement type
> * validate subsequent use
> * look reasonable in the source code
> 
> 
> # lambda
> Not mentioned in the PEP, omitted for convenience or is there a rationale?
> f = lambda x: None if x is None else str(x ** 2)
> Current syntax seems to preclude annotation of `x` due to colon.
> Current syntax sort of allows lamba return type annotation, but it's
> easy to confuse with `f`.

I don't believe that you can annotate lambda functions with current 
syntax. For many purposes, I do not think that is important: a good type 
checker will often be able to infer the return type of the lambda, and 
from that infer what argument types are permitted:

lambda arg: arg + 1

Obviously arg must be a Number, since it has to support addition with 
ints.
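(For the record, a sketch of the usual workaround: since a lambda
cannot carry annotations, spell it as a small def, which can. The
name `f` is just Dima's example reused.)

```python
from typing import Optional

# Dima's lambda, rewritten as a 'def' so that both the parameter
# and the return type can be annotated:
def f(x: Optional[float]) -> Optional[str]:
    return None if x is None else str(x ** 2)

assert f(None) is None
assert f(2) == "4"
```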


> # local variables
> Not mentioned in the PEP
> Non-trivial code could really use these.

Normally local variables will have their type inferred from the 
operations done to them:

s = arg[1:]  # s has the same type as arg

When that is not satisfactory, you can annotate variables with a comment:

s = arg[1:]  # type: List[int]

https://www.python.org/dev/peps/pep-0484/#id24


> # global variables
> Not mentioned in the PEP
> Module-level globals are part of API, annotation is welcome.
> What is the syntax?

As above.


> # comprehensions
> [3 * x.data for x in foo if "bar" in x.type]
> Arguable, perhaps annotation is only needed on `foo` here, but then
> how complex comprehensions, e.g. below, the intermediate comprehension
> could use an annotation
> [xx for y in [...] if ...]

A list comprehension is obviously of type List. If you need to give a 
more specific hint:

result = [expr for x in things if cond(x)]  # type: List[Whatever]

See also the discussion of "cast" in the PEP.

https://www.python.org/dev/peps/pep-0484/#id25


> # class attributes
> s = socket.socket(...)
> s.type, s.family, s.proto  # int
> s.fileno  # callable
> If annotations are only available for methods, it will lead to
> Java-style explicit getters and setters.
> Python language and data model prefers properties instead, thus
> annotations are needed on attributes.

class Thing:
    a = 42  # can be inferred
    b = []  # inferred as List[Any]
    c = []  # type: List[float]



-- 
Steve


[Python-Dev] Postponing making a decision on the future of the development workflow

2015-04-30 Thread Brett Cannon
Real world stuff is devouring my free time since immediately after PyCon
and will continue to do so for probably the next few months. I'm hoping to
find the energy to engage Donald and Nick about their proposals while I'm
time-constrained so that when I do have free time again I will be able to
make a decision quickly.

This also means that I'm allowing Donald and Nick to update their PEPs
while they wait for me, although I'm not taking on any more proposals as
the two current proposals cover the two ranges of suggestions people have
talked to me about on this topic.

My apologies to Nick and Donald for slipping on my own deadline.


Re: [Python-Dev] PEP 492 vs. PEP 3152, new round

2015-04-30 Thread Ethan Furman
On 04/29, Nathaniel Smith wrote:

> (I suspect this may also be the impetus behind Greg's request that it just
> be treated the same as unary minus. IMHO it matters much more that the
> rules be predictable and teachable than that they allow or disallow every
> weird edge case in exactly the right way.)

+1

--
~Ethan~


Re: [Python-Dev] Unicode literals in Python 2.7

2015-04-30 Thread Adam Bartoš
> does this not work for you?
>
> from __future__ import unicode_literals

No, with unicode_literals I just don't have to use the u'' prefix, but the
wrong interpretation persists.


On Thu, Apr 30, 2015 at 3:03 AM, Stephen J. Turnbull 
wrote:

>
> IIRC, on the Linux console and in an uxterm, PYTHONIOENCODING=utf-8 in
> the environment does what you want.


Unfortunately, it doesn't work. With PYTHONIOENCODING=utf-8, the sys.std*
streams are created with utf-8 encoding (which doesn't help on Windows
since they still don't use ReadConsoleW and WriteConsoleW to communicate
with the terminal) and after changing the sys.std* streams to the fixed
ones and setting readline hook, it still doesn't work, so presumably the
PyCF_SOURCE_IS_UTF8 is still not set.



> Regarding your environment, the repeated use of "custom" is a red
> flag.  Unless you bundle your whole environment with the code you
> distribute, Python can know nothing about that.  In general, Python
> doesn't know what encoding it is receiving text in.
>

Well, the received text comes from sys.stdin and its encoding is known.
Ideally, Python would receive the text as a Unicode string object so there
would be no problem with encoding (see
http://bugs.python.org/issue17620#msg234439 ).


If you *do* know, you can set PyCF_SOURCE_IS_UTF8.  So if you know
> that all of your users will have your custom stdio and readline hooks
> installed (AFAICS, they can't use IDLE or IPython!), then you can
> bundle Python built with the flag set, or perhaps you can do the
> decoding in your custom stdio module.
>

The custom stdio streams and readline hooks are set at runtime by a code in
sitecustomize. It does not affect IDLE and it is compatible with IPython. I
would like to also set PyCF_SOURCE_IS_UTF8 at runtime from Python e.g. via
ctypes. But this may be impossible.



> Note that even if you have a UTF-8 input source, some users are likely
> to be surprised because IIRC Python doesn't canonicalize in its
> codecs; that is left for higher-level libraries.  Linux UTF-8 is
> usually NFC normalized, while Mac UTF-8 is NFD normalized.
>

Actually, I have a UTF-16-LE source, but that is not important since it's
decoded to a Python Unicode string object. I have this Unicode string and I'm
supposed to return it from the readline hook, but I don't know how to communicate it
to the caller – the tokenizer – so it is interpreted correctly. Note that
the following works:

>>> eval(raw_input('~~> '))
~~> u'α'
u'\u03b1'

Unfortunately, the REPL works differently than eval/exec on raw_input. It
seems that the only option is to bypass the REPL by a custom REPL (e.g.
based on code.InteractiveConsole). However, wrapping up the execution of a
script, so that the custom REPL is invoked at the right place, is
complicated.


> >>> On 29 Apr 2015 10:36, "Adam Bartoš" wrote:
> >>> > Why I'm talking about PyCF_SOURCE_IS_UTF8? eval(u"u'\u03b1'") ->
> >>> u'\u03b1' but eval(u"u'\u03b1'".encode('utf-8')) -> u'\xce\xb1'.
>
> Just to be clear, you accept those results as correct, right?
>

Yes. In the latter case, eval has no idea how the bytes given are encoded.


Re: [Python-Dev] PEP 492: async/await in Python; version 4

2015-04-30 Thread Yury Selivanov

On 2015-04-30 5:16 AM, Greg Ewing wrote:

Yury Selivanov wrote:


3. CO_NATIVE_COROUTINE flag. This enables us to disable
__iter__ and __next__ on native coroutines while maintaining
full backwards compatibility.


I don't think you can honestly claim "full backwards
compatibility" as long as there are some combinations
of old-style and new-style code that won't work
together. You seem to be using your own personal
definition of "full" here.



Well, using next() and iter() on coroutines in asyncio
code is something esoteric.  I can't even imagine
why you would want to do that.

Yury


Re: [Python-Dev] PEP 492 vs. PEP 3152, new round

2015-04-30 Thread Ethan Furman
On 04/29, Yury Selivanov wrote:

> Because you want operators to be resolved in the
> order you see them, generally.
> 
> You want '(await -fut)' to:
> 
> 1. Suspend on fut;
> 2. Get the result;
> 3. Negate it.
> 
> This is a non-obvious thing. I would myself interpret it
> as:
> 
> 1. Get fut.__neg__();
> 2. await on it.

Both you and Paul are correct on this, thank you.  The proper resolution
of

  await -coro() 

is indeed to get the result of coro(), call its __neg__ method, and then
await on that.

And that is perfectly reasonable, and should not be a SyntaxError; what it
might be is an AttributeError (no __neg__ method) or an AsyncError (__neg__
returned non-awaitable object), or might even just work [1]... but it
definitely should /not/ be a SyntaxError.

--
~Ethan~

[1] http://stackoverflow.com/q/7719018/208880


Re: [Python-Dev] PEP 492: What is the real goal?

2015-04-30 Thread Jim J. Jewett
On Wed, Apr 29, 2015 at 2:26 PM, Paul Moore  wrote:
> On 29 April 2015 at 18:43, Jim J. Jewett  wrote:

>> So?  PEP 492 never says what coroutines *are* in a way that explains
>> why it matters that they are different from generators.

...

> Looking at the Wikipedia article on coroutines, I see an example of
> how a producer/consumer process might be written with coroutines:
>
> var q := new queue
>
> coroutine produce
>     loop
>         while q is not full
>             create some new items
>             add the items to q
>         yield to consume
>
> coroutine consume
>     loop
>         while q is not empty
>             remove some items from q
>             use the items
>         yield to produce
>
> (To start everything off, you'd just run "produce").
>
> I can't even see how to relate that to PEP 492 syntax. I'm not allowed
> to use "yield", so should I use "await consume" in produce (and vice
> versa)?

I think so ... but the fact that nothing is actually coming via the
await channel makes it awkward.

I also worry that it would end up with an infinite stack depth, unless
the await were actually replaced with some sort of framework-specific
scheduling primitive, or one of them were rewritten differently to
ensure it returned to the other instead of calling it anew.

I suspect the real problem is that the PEP is really only concerned
with a very specific subtype of coroutine, and these don't quite fit.
(Though it could be done by somehow making them both await on the
queue status, instead of on each other.)

-jJ


Re: [Python-Dev] PEP 492 vs. PEP 3152, new round

2015-04-30 Thread Guido van Rossum
On Thu, Apr 30, 2015 at 9:15 AM, Ethan Furman  wrote:

> [...]
> Both you and Paul are correct on this, thank you.  The proper resolution
> of
>
>   await -coro()
>
> is indeed to get the result of coro(), call its __neg__ method, and then
> await on that.
>
> And that is perfectly reasonable, and should not be a SyntaxError; what it
> might be is an AttributeError (no __neg__ method) or an AsyncError (__neg__
> returned non-awaitable object), or might even just work [1]... but it
> definitely should /not/ be a SyntaxError.
>

Why not? Unlike some other languages, Python does not have uniform
priorities for unary operators, so it's reasonable for some unary
operations to have a different priority than others, and certain things
will be SyntaxErrors because of that. E.g. you can write "not -x" but you
can't write "- not x". This is because they have different priorities:
'not' has a very low priority so it binds less tight than comparisons
(which bind less tight than arithmetic), but '-' has a high priority.
(There are other quirks, e.g. -2**2 means -(2**2) and you can write 2**-2;
but you can't write 2**not x.)
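To make those quirks concrete, here is a quick sketch (my own, for
illustration) that can be pasted into a REPL:

```python
# Unary operators in Python do not share a single priority level:
assert -2**2 == -4        # '**' binds tighter than unary '-'
assert 2**-2 == 0.25      # but unary '-' may appear after '**'
assert (not -1) is False  # 'not -x' parses fine

# '- not x' and '2**not x' are rejected by the grammar:
for src in ("- not x", "2**not x"):
    try:
        compile(src, "<test>", "eval")
    except SyntaxError:
        pass  # expected: the grammar forbids this ordering
    else:
        raise AssertionError(src)
```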

-- 
--Guido van Rossum (python.org/~guido)


Re: [Python-Dev] PEP 492: What is the real goal?

2015-04-30 Thread Guido van Rossum
On Thu, Apr 30, 2015 at 10:24 AM, Jim J. Jewett 
wrote:

> I suspect the real problem is that the PEP is really only concerned
> with a very specific subtype of coroutine, and these don't quite fit.
>

That's correct. The PEP is concerned with the existing notion of coroutines
in Python, which was first introduced by PEP 342: Coroutines via Enhanced
Generators. The Wikipedia definition of coroutine (which IIRC is due to
Knuth) is quite different and nobody who actually uses the coding style
introduced by PEP 342 should mistake one for the other.

This same notion of "Pythonic" (so to speak) coroutines was refined by PEP
380, which introduced yield from. It was then *used* in PEP 3156 (the
asyncio package) for the specific purpose of standardizing a way to do I/O
multiplexing using an event loop.

The basic premise of using coroutines with the asyncio package is that most
of the time you can write *almost* sequential code as long as you insert
"yield from" in front of all blocking operations (and as long as you use
blocking operations that are implemented by or on top of the asyncio
package). This makes the code easier to follow than code written
using "traditional" event-loop-based I/O multiplexing (which is heavy on
callbacks, or callback-like abstractions like Twisted's Deferred).

However, heavy users of the asyncio package (like Yury) discovered some
common patterns when using coroutines that were awkward. In particular,
"yield from" is quite a mouthful, the coroutine version of a for-loop is
awkward, and a with-statement can't have a blocking operation in __exit__
(because there's no explicit yield opcode). PEP 492 proposes a quite simple
and elegant solution for these issues. Most of the technical discussion
about the PEP is on getting the details right so that users won't have to
worry about them, and can instead just continue to write *almost*
sequential code when using the asyncio package (or some other framework
that offers an event loop integrated with coroutines).
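As a minimal sketch of that "almost sequential" style, written in the
PEP 492 syntax under discussion (the sleep stands in for a real
blocking operation, and the event-loop setup is just one way to run
a coroutine to completion):

```python
import asyncio

async def fetch():
    await asyncio.sleep(0)  # stands in for a blocking I/O operation
    return 42

# Run the coroutine to completion on an event loop.
loop = asyncio.new_event_loop()
try:
    result = loop.run_until_complete(fetch())
finally:
    loop.close()
assert result == 42
```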

-- 
--Guido van Rossum (python.org/~guido)


Re: [Python-Dev] PEP 492 vs. PEP 3152, new round

2015-04-30 Thread Ethan Furman
On 04/30, Guido van Rossum wrote:
> On Thu, Apr 30, 2015 at 9:15 AM, Ethan Furman wrote:
> 
>>  [...]
>>  Both you and Paul are correct on this, thank you.  The proper resolution
>>  of
>> 
>>await -coro()
>> 
>>  is indeed to get the result of coro(), call its __neg__ method, and then
>>  await on that.
>> 
>>  And that is perfectly reasonable, and should not be a SyntaxError; what it
>>  might be is an AttributeError (no __neg__ method) or an AsyncError (__neg__
>>  returned non-awaitable object), or might even just work [1]... but it
>>  definitely should /not/ be a SyntaxError.
>> 
> 
> Why not? Unlike some other languages, Python does not have uniform
> priorities for unary operators, so it's reasonable for some unary
> operations to have a different priority than others, and certain things
> will be SyntaxErrors because of that. E.g. you can write "not -x" but you
> can't write "- not x".

For one, Yury's answer is "- await x" which looks just as nonsensical as
"- not x".

For another, an error of some type will be raised if either __neg__ doesn't
exist or it doesn't return an awaitable, so a SyntaxError is unnecessary.

For a third, by making it a SyntaxError you are forcing the use of parens to
get what should be the behavior anyway.

In other words, a SyntaxError is not any clearer than "AttributeError: obj
has no __neg__ method" and it's not any clearer than "AwaitError: __neg__
returned not-awaitable".  Those last two errors tell you exactly what you
did wrong.

--
~Ethan~


Re: [Python-Dev] PEP 492 vs. PEP 3152, new round

2015-04-30 Thread Yury Selivanov



On 2015-04-30 1:56 PM, Ethan Furman wrote:

>Why not? Unlike some other languages, Python does not have uniform
>priorities for unary operators, so it's reasonable for some unary
>operations to have a different priority than others, and certain things
>will be SyntaxErrors because of that. E.g. you can write "not -x" but you
>can't write "- not x".

For one, Yury's answer is "- await x" which looks just as nonsensical as
"- not x".



"- await x" is perfectly valid code:

result = - await compute_in_db()

(same as "result = - (yield from do_something())")



For another, an error of some type will be raised if either __neg__ doesn't
exist or it doesn't return an awaitable, so a SyntaxError is unnecessary.

For a third, by making it a SyntaxError you are forcing the use of parens to
get what should be the behavior anyway.


I still want to see where my current grammar forces the use of
parens.  See [1], there are no useless parens anywhere.

FWIW, I'll fix the 'await (await x)' expression to be parsed
without parens.



In other words, a SyntaxError is not any clearer than "AttributeError: obj
has no __neg__ method" and it's not any clearer than "AwaitError: __neg__
returned not-awaitable".  Those last two errors tell you exactly what you
did wrong.


This is debatable. "obj has no __neg__ method" isn't obvious
to everyone (especially to those people who aren't using
operator overloading).


[1] https://www.python.org/dev/peps/pep-0492/#examples-of-await-expressions


Yury


Re: [Python-Dev] PEP 492 vs. PEP 3152, new round

2015-04-30 Thread Ethan Furman
On 04/30, Yury Selivanov wrote:
> On 2015-04-30 1:56 PM, Ethan Furman wrote:

> I still want to see where my current grammar forces the use of
> parens.  See [1], there are no useless parens anywhere.

  --> await -coro()
  SyntaxError
  --> await (-coro())  # not a SyntaxError, therefore parens are
                       # forced


>> In other words, a SyntaxError is nat any clearer than "AttributeError: obj
>> has no __neg__ method" and it's not any clearer than "AwaitError: __neg__
>> returned not-awaitable".  Those last two errors tell you exactly what you
>> did wrong.
> 
> This is debatable. "obj has no __neg__ method" isn't obvious
> to everyone (especially to those people who aren't using
> operator overloading).

Good news!  The error there is actually

  --> -object()
  TypeError: bad operand type for unary -: 'object'

Which is definitely clear, even for those who don't do operator overloading.

--
~Ethan~


Re: [Python-Dev] PEP 492: What is the real goal?

2015-04-30 Thread Jim J. Jewett

On Wed Apr 29 20:06:23 CEST 2015,Yury Selivanov replied:

>> As best I can guess, the difference seems to be that a "normal"
>> generator is using yield primarily to say:

>>  "I'm not done; I have more values when you want them",

>> but an asynchronous (PEP492) coroutine is primarily saying:

>>  "This might take a while, go ahead and do something else meanwhile."

> Correct.

Then I strongly request a more specific name than coroutine.


I would prefer something that refers to cooperative pre-emption,
but I haven't thought of anything that is short without leading to
other types of confusion.

My least bad idea at the moment would be "self-suspending coroutine"
to emphasize that suspending themselves is a crucial feature.

Even "PEP492-coroutine" would be an improvement.


>> Does it really permit *making* them [asynchronous calls], or does
>> it just signal that you will be waiting for them to finish processing
>> anyhow, and it doesn't need to be a busy-wait?

> It does.

Bad phrasing on my part.  Is there anything that prevents an
asynchronous call (or waiting for one) without the "async with"?

If so, I'm missing something important.  Either way, I would
prefer different wording in the PEP.



>>> It uses the ``yield from`` implementation with an extra step of
>>> validating its argument.  ``await`` only accepts an *awaitable*, 
>>> which can be one of:

>> What justifies this limitation?

> We want to avoid people passing regular generators and random
> objects to 'await', because it is a bug.

Why?

Is it a bug just because you defined it that way?

Is it a bug because the "await" makes timing claims that an
object not making such a promise probably won't meet?  (In
other words, a marker interface.)

Is it likely to be a symptom of something that wasn't converted
correctly, *and* there are likely to be other bugs caused by
that same lack of conversion?

> For coroutines in PEP 492:

> __await__ = __anext__ is the same as __call__ = __next__
> __await__ = __aiter__ is the same as __call__ = __iter__

That tells me that it will be OK sometimes, but will usually
be either a mistake or an API problem -- and it explains why.

Please put those 3 lines in the PEP.


> This is OK. The point is that you can use 'await log' in
> __aenter__.  If you don't need awaits in __aenter__ you can
> use them in __aexit__. If you don't need them there too,
> then just define a regular context manager.

Is it an error to use "async with" on a regular context manager?
If so, why?  If it is just that doing so could be misleading,
then what about "async with mgr1, mgr2, mgr3" -- is it enough
that one of the three might suspend itself?

>> class AsyncContextManager:
>> def __aenter__(self):
>> log('entering context')


> __aenter__ must return an awaitable

Why?  Is there a fundamental reason, or it is just to avoid the
hassle of figuring out whether or not the returned object is a
future that might still need awaiting?

Is there an assumption that the scheduler will let the thing-being
awaited run immediately, but look for other tasks when it returns,
and a further assumption that something which finishes the whole
task would be too slow to run right away?

> It doesn't make any sense to use 'async with' outside of a
> coroutine.  The interpreter won't know what to do with them:
> you need an event loop for that.

So does the PEP also provide some way of ensuring that there is
an event loop?  Does it assume that self-suspending coroutines
will only ever be called by an already-running event loop
compatible with asyncio.get_event_loop()?  If so, please make
these contextual assumptions explicit near the beginning of the PEP.


>>> It is a ``TypeError`` to pass a regular iterable without ``__aiter__``
>>> method to ``async for``.  It is a ``SyntaxError`` to use ``async for``
>>> outside of a coroutine.

>> The same questions about why -- what is the harm?

I can imagine that as an implementation detail, the async for wouldn't
be taken advantage of unless it was running under an event loop that
knew to look for "async for" as suspension points.

I'm not seeing what the actual harm is in either not happening to
suspend (less efficient, but still correct), or in suspending between
every step of a regular iterator (because, why not?)


>>> For debugging this kind of mistake there is a special debug mode in
>>> asyncio, in which ``@coroutine`` 
...
>>> decorator makes the decision of whether to wrap or not to wrap based on
>>> an OS environment variable ``PYTHONASYNCIODEBUG``.

(1)  How does this differ from the existing asyncio.coroutine?
(2)  Why does it need to have an environment variable?  (Sadly,
 the answer may be "backwards compatibility", if you're really
 just specifying the existing asyncio interface better.)
(3)  Why does it need [set]get_coroutine_wrapper, instead of just
 setting the asyncio.coroutines.coroutine attribute?
(4)  Why do the get/set need to be in sys?

Is

Re: [Python-Dev] PEP 492: What is the real goal?

2015-04-30 Thread Paul Moore
On 29 April 2015 at 20:19, Paul Moore  wrote:
> However, just to make my point precise, here's a more or less direct
> translation of the Wikipedia code into Python. It doesn't actually
> work, because getting the right combinations of yield and send stuff
> is confusing to me. Specifically, I suspect that "yield
> produce.send(None)" isn't the right way to translate "yield to
> produce". But it gives the idea.

Hmm, when I try to fix this "minor" (as I thought!) issue with my
code, I discover it's more fundamental. The error I get is

Traceback (most recent call last):
  File ".\coro.py", line 28, in <module>
    next(produce)
  File ".\coro.py", line 13, in produce
    yield consume.send(None)
  File ".\coro.py", line 23, in consume
    yield produce.send(None)
ValueError: generator already executing

What I now realise that means is that you cannot have producer send to
consumer which then sends back to producer. That's what the "generator
already executing" message means.

This is fundamentally different from the "traditional" use of
coroutines as described in the Wikipedia article, and as I thought was
implemented in Python. The Wikipedia example allows two coroutines to
freely yield between each other. Python, on the other hand, does not
support this - it requires the mediation of some form of "trampoline"
controller (or event loop, in asyncio terms) to redirect control. [1]
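A minimal sketch of that difference (invented `produce`/`consume` names, not the actual script from the traceback): two generators cannot resume each other directly, but a trivial trampoline that shuttles values between them works fine:

```python
consumed = []

def produce(n):
    for i in range(n):
        yield i                  # suspend; hand the item to the trampoline

def consume():
    while True:
        item = yield             # suspend until the trampoline sends an item
        consumed.append(item)

def trampoline(n):
    c = consume()
    next(c)                      # prime the consumer up to its first yield
    for item in produce(n):      # resume producer and consumer in turn
        c.send(item)

trampoline(3)
assert consumed == [0, 1, 2]
```

Had `produce` instead called `consume.send()` itself while `consume` was simultaneously sending back into `produce`, the second send would hit a still-executing frame and raise the `ValueError` above.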

This limitation of Python's coroutines is not mentioned anywhere in
PEP 342, and that's probably why I never really understood Python
coroutines properly, as my mental model didn't match the
implementation.

Given that any non-trivial use of coroutines in Python requires an
event loop / trampoline, I begin to understand the logic behind
asyncio and this PEP a little better. I'm a long way behind in
understanding the details, but at least I'm no longer completely
baffled.

Somewhere, there should be an explanation of the difference between
Python's coroutines and Wikipedia's - I can't be the only person to be
confused like this. But I don't think there's any docs covering
"coroutines in Python" outside of PEP 342 - the docs just cover the
components (the send and throw methods, the yield expression, etc).
Maybe it could be covered in the send documentation (as that's what
gives the "generator already executing" error). I'll try to work up a
doc patch. Actually, looking at the docs, I can't even *find* where
the behaviour of the send method is defined - can someone point me in
the right direction?

Paul

[1] It's sort of similar to how Python doesn't do tail call
elimination. Symmetric yields rely on stack frames that are no longer
needed being discarded if they are to avoid unlimited recursion, so to
have symmetric yields, Python would need a form of tail call ("tail
yield", I guess :-)) elimination.
___
Python-Dev mailing list
[email protected]
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: 
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com


Re: [Python-Dev] PEP 492: What is the real goal?

2015-04-30 Thread Yury Selivanov

Jim,

On 2015-04-30 2:41 PM, Jim J. Jewett wrote:
[...]

Does it really permit *making* them [asynchronous calls], or does
it just signal that you will be waiting for them to finish processing
anyhow, and it doesn't need to be a busy-wait?

It does.

Bad phrasing on my part.  Is there anything that prevents an
asynchronous call (or waiting for one) without the "async with"?

If so, I'm missing something important.  Either way, I would
prefer different wording in the PEP.


Yes, you can't use 'yield from' in __exit__/__enter__
in current Python.


It uses the ``yield from`` implementation with an extra step of
validating its argument.  ``await`` only accepts an *awaitable*,
which can be one of:

What justifies this limitation?

We want to avoid people passing regular generators and random
objects to 'await', because it is a bug.

Why?

Is it a bug just because you defined it that way?

Is it a bug because the "await" makes timing claims that an
object not making such a promise probably won't meet?  (In
other words, a marker interface.)

Is it likely to be a symptom of something that wasn't converted
correctly, *and* there are likely to be other bugs caused by
that same lack of conversion?


Same as 'yield from' is expecting an iterable, await
is expecting an awaitable.  That's the protocol.

You can't pass random objects to 'with' statements,
'yield from', 'for..in', etc.

If you write

   def gen(): yield 1
   await gen()

then it's a bug.
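A sketch of the protocol in question (the class name is made up): an object becomes awaitable by defining `__await__`, which must return an iterator; each yield from that iterator is a suspension point, and an event loop drives the resulting coroutine with `send()`:

```python
class Awaitable:
    def __await__(self):
        # __await__ must return an iterator; each yield is a suspension point
        yield "suspended"
        return 42

async def main():
    return await Awaitable()

coro = main()
# Drive the coroutine by hand, as an event loop would:
first = coro.send(None)
try:
    coro.send(None)
    result = None
except StopIteration as stop:    # the final value arrives via StopIteration
    result = stop.value

assert first == "suspended"
assert result == 42
```

A plain generator such as `gen()` above, by contrast, is rejected by `await` with a TypeError because it does not implement this protocol.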




For coroutines in PEP 492:
__await__ = __anext__ is the same as __call__ = __next__
__await__ = __aiter__ is the same as __call__ = __iter__

That tells me that it will be OK sometimes, but will usually
be either a mistake or an API problem -- and it explains why.

Please put those 3 lines in the PEP.


There is a line like that:
https://www.python.org/dev/peps/pep-0492/#await-expression
Look for "Also, please note..." line.


This is OK. The point is that you can use 'await log' in
__aenter__.  If you don't need awaits in __aenter__ you can
use them in __aexit__. If you don't need them there too,
then just define a regular context manager.

Is it an error to use "async with" on a regular context manager?
If so, why?  If it is just that doing so could be misleading,
then what about "async with mgr1, mgr2, mgr3" -- is it enough
that one of the three might suspend itself?


'with' requires an object with __enter__ and __exit__

'async with' requires an object with __aenter__ and __aexit__

You can have an object that implements both interfaces.
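A sketch of such a dual-protocol object (hypothetical class; `asyncio.sleep(0)` stands in for a real suspension point, and `asyncio.run` for whatever loop drives the coroutine):

```python
import asyncio

class DualCM:
    # synchronous protocol, for plain 'with'
    def __enter__(self):
        return "sync"
    def __exit__(self, exc_type, exc, tb):
        return False

    # asynchronous protocol, for 'async with'
    async def __aenter__(self):
        await asyncio.sleep(0)       # may genuinely suspend here
        return "async"
    async def __aexit__(self, exc_type, exc, tb):
        return False

with DualCM() as tag:
    sync_tag = tag

async def use():
    async with DualCM() as tag:
        return tag

async_tag = asyncio.run(use())
assert sync_tag == "sync" and async_tag == "async"
```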




 class AsyncContextManager:
 def __aenter__(self):
 log('entering context')



__aenter__ must return an awaitable

Why?  Is there a fundamental reason, or it is just to avoid the
hassle of figuring out whether or not the returned object is a
future that might still need awaiting?


The fundamental reason why 'async with' is proposed is because
you can't suspend execution in __enter__ and __exit__.
If you need to suspend it there, use 'async with' and
its __a*__ methods, but they have to return awaitable
(see https://www.python.org/dev/peps/pep-0492/#new-syntax
and look what 'async with' is semantically equivalent to)
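The equivalence can be spelled out by hand; this is a simplified expansion under a hypothetical `Manager`, omitting some corner cases the PEP handles:

```python
import asyncio

class Manager:
    async def __aenter__(self):          # calling this returns an awaitable
        return "resource"
    async def __aexit__(self, exc_type, exc, tb):
        return False

async def expanded():
    # approximate expansion of:  async with Manager() as var: BLOCK
    mgr = Manager()
    aexit = type(mgr).__aexit__
    aenter = type(mgr).__aenter__
    var = await aenter(mgr)              # suspension point on enter
    try:
        block_result = var               # BLOCK goes here
    except BaseException as e:
        if not await aexit(mgr, type(e), e, e.__traceback__):
            raise
    else:
        await aexit(mgr, None, None, None)   # suspension point on exit
    return block_result

assert asyncio.run(expanded()) == "resource"
```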



Is there an assumption that the scheduler will let the thing-being
awaited run immediately, but look for other tasks when it returns,
and a further assumption that something which finishes the whole
task would be too slow to run right away?


It doesn't make any sense to use 'async with' outside of a
coroutine.  The interpreter won't know what to do with them:
you need an event loop for that.

So does the PEP also provide some way of ensuring that there is
an event loop?  Does it assume that self-suspending coroutines
will only ever be called by an already-running event loop
compatible with asyncio.get_event_loop()?  If so, please make
these contextual assumptions explicit near the beginning of the PEP.


You need some kind of loop, but it doesn't have to be the one
from asyncio.  There is at least one place in the PEP where
it's mentioned that the PEP introduces a generic concept
that can be used by asyncio *and* other frameworks.





It is a ``TypeError`` to pass a regular iterable without ``__aiter__``
method to ``async for``.  It is a ``SyntaxError`` to use ``async for``
outside of a coroutine.

The same questions about why -- what is the harm?

I can imagine that as an implementation detail, the async for wouldn't
be taken advantage of unless it was running under an event loop that
knew to look for "async for" as suspension points.


Event loop doesn't need to know anything about 'async with'
and 'async for'. To the loop it's always one thing -- something,
somewhere, is awaiting some result.



I'm not seeing what the actual harm is in either not happening to
suspend (less efficient, but still correct), or in suspending between
every step of a regular iterator (because, why not?)



For debugging this kind of mistake there is a special debug mode in
asyncio, in which ``@coroutine``

Re: [Python-Dev] PEP 492: What is the real goal?

2015-04-30 Thread Guido van Rossum
On Thu, Apr 30, 2015 at 11:41 AM, Jim J. Jewett 
wrote:

>
> On Wed Apr 29 20:06:23 CEST 2015,Yury Selivanov replied:
>
> >> As best I can guess, the difference seems to be that a "normal"
> >> generator is using yield primarily to say:
>
> >>  "I'm not done; I have more values when you want them",
>

This seems so vague as to be useless to me. When using generators to
implement iterators, "yield" very specifically means "here is the next
value in the sequence I'm generating". (And to indicate there are no more
values you have to use "return".)


> >> but an asynchronous (PEP492) coroutine is primarily saying:
>
> >>  "This might take a while, go ahead and do something else
> meanwhile."
>
> > Correct.
>

Actually that's not even wrong. When using generators as coroutines, PEP
342 style, "yield" means "I am blocked waiting for a result that the I/O
multiplexer is eventually going to produce". The argument to yield tells
the multiplexer what the coroutine is waiting for, and it puts the
generator stack frame on an appropriate queue. When the multiplexer has
obtained the requested result it resumes the coroutine by using send() with
that value, which resumes the coroutine/generator frame, making that value
the return value from yield.
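A toy version of that arrangement (invented names; a real multiplexer would obtain results from select/epoll rather than fabricating them):

```python
from collections import deque

results = []

def worker(name):
    # yield means: "I'm blocked; here's what I'm waiting for"
    value = yield "fetch"            # the argument describes the request
    results.append((name, value))    # resumed with the multiplexer's answer

def multiplexer(coros):
    queue = deque((c, None) for c in coros)
    while queue:
        coro, answer = queue.popleft()
        try:
            request = coro.send(answer)      # run until the next yield
        except StopIteration:
            continue                         # this coroutine is finished
        # pretend the requested I/O has completed; requeue with its result
        queue.append((coro, "result of " + request))

multiplexer([worker("a"), worker("b")])
assert results == [("a", "result of fetch"), ("b", "result of fetch")]
```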

Read Greg Ewing's tutorial for more color:
http://www.cosc.canterbury.ac.nz/greg.ewing/python/yield-from/yield_from.html

Then I strongly request a more specific name than coroutine.
>

No, this is the name we've been using since PEP 342 and it's still the same
concept.

-- 
--Guido van Rossum (python.org/~guido)


Re: [Python-Dev] PEP 492: What is the real goal?

2015-04-30 Thread Antoine Pitrou
On Thu, 30 Apr 2015 12:32:02 -0700
Guido van Rossum  wrote:
> 
> No, this is the name we've been using since PEP 342 and it's still the same
> concept.

The fact that all syntax uses the word "async" and not "coro" or
"coroutine" hints that it should really *not* be called a coroutine
(much less a "native coroutine", which is both silly and a lie).

Why not "async function"?

Regards

Antoine.




Re: [Python-Dev] PEP 492: What is the real goal?

2015-04-30 Thread Paul Moore
On 30 April 2015 at 20:32, Guido van Rossum  wrote:
>> Then I strongly request a more specific name than coroutine.
>
> No, this is the name we've been using since PEP 342 and it's still the same
> concept.

However, it is (as I noted in my other email) not very well
documented. There isn't a glossary entry in the docs for "coroutine",
and there's nothing pointing out that coroutines need (for anything
other than toy cases) an event loop, trampoline, or IO multiplexer
(call it what you want, although I prefer terms that don't make it
sound like it's exclusively about IO).

I'll raise an issue on the tracker for this, and I'll see if I can
write up something. Once there's a non-expert's view in the docs, the
experts can clarify the technicalities if I get them wrong :-) I
propose a section under
https://docs.python.org/3/reference/expressions.html#yield-expressions
describing coroutines, and their usage.

Paul


Re: [Python-Dev] PEP 492: What is the real goal?

2015-04-30 Thread Guido van Rossum
It is spelled "Raymond Luxury-Yacht", but it's pronounced "Throatwobbler
Mangrove". :-)

I am actually fine with calling a function defined with "async def ..." an
async function, just as we call a function containing "yield" a generator
function.

However I prefer to still use "coroutine" to describe the concept
implemented by async functions. *Some* generator functions also implement
coroutines; however I would like to start a movement where eventually we'll
always be using async functions when coroutines are called for, dedicating
generators once again to their pre-PEP-342 role of a particularly efficient
way to implement iterators.

Note that I'm glossing over the distinction between yield and yield-from
here; both can be used to implement the coroutine pattern, but the latter
has some advantages when the pattern is used to support an event loop: most
importantly, when using yield-from-style coroutines, a coroutine can use
return to pass a value directly to the stack frame that is waiting for its
result. Prior to PEP 380 (yield from), the trampoline would have to be
involved in this step, and there was no standard convention for how to
communicate the final result to the trampoline; I've seen "returnValue(x)"
(Twisted inlineCallbacks), "raise ReturnValue(x)" (Google App Engine NDB),
"yield Return(x)" (Monocle) and I believe I've seen plain "yield x" too
(the latter two being abominations in my mind, since it's unclear whether
the generator is resumed after a value-returning yield).
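The PEP 380 mechanism in miniature: the sub-generator's `return` value is delivered directly to the frame suspended at `yield from`, with no trampoline convention involved:

```python
def sub():
    yield "working"             # a suspension point, as in a coroutine
    return 42                   # result goes straight to the waiting frame

def outer():
    result = yield from sub()   # resumes with sub()'s return value
    yield ("got", result)

g = outer()
assert next(g) == "working"
assert next(g) == ("got", 42)
```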

While yield-from was an improvement over plain yield, await is an
improvement over yield-from. As with most changes to Python (as well as
natural evolution), an improvement often leads the way to another
improvement -- one that wasn't obvious before. And that's fine. If I had
laid awake worrying about the best way to spell async functions while
designing asyncio, PEP 3156 probably still wouldn't have been finished
today.

On Thu, Apr 30, 2015 at 12:40 PM, Antoine Pitrou 
wrote:

> On Thu, 30 Apr 2015 12:32:02 -0700
> Guido van Rossum  wrote:
> >
> > No, this is the name we've been using since PEP 342 and it's still the
> same
> > concept.
>
> The fact that all syntax uses the word "async" and not "coro" or
> "coroutine" hints that it should really *not* be called a coroutine
> (much less a "native coroutine", which both silly and a lie).
>
> Why not "async function"?
>
> Regards
>
> Antoine.
>
>



-- 
--Guido van Rossum (python.org/~guido)


Re: [Python-Dev] PEP 492: async/await in Python; version 4

2015-04-30 Thread Greg Ewing

Yury Selivanov wrote:

Well, using next() and iter() on coroutines in asyncio
code is something esoteric.  I can't even imagine
why you would want to do that.


I'm talking about the fact that existing generator-
based coroutines that aren't decorated with
@coroutine won't be able to call new ones that use
async def.

This means that converting one body of code to the
new style can force changes in other code that
interacts with it.

Maybe this is not considered a problem, but as long
as it's true, I don't think it's accurate to claim
"full backwards compatibility".

--
Greg


Re: [Python-Dev] PEP 492: async/await in Python; version 4

2015-04-30 Thread Yury Selivanov

On 2015-04-30 7:24 PM, Greg Ewing wrote:

Yury Selivanov wrote:

Well, using next() and iter() on coroutines in asyncio
code is something esoteric.  I can't even imagine
why you would want to do that.


I'm talking about the fact that existing generator-
based coroutines that aren't decorated with
@coroutine won't be able to call new ones that use
async def.



Ah, alright.

You quoted this:

3. CO_NATIVE_COROUTINE flag. This enables us to
disable __iter__ and __next__ on native coroutines
while maintaining full backwards compatibility.

I wrote "full backwards compatibility" for that
particular point #3 -- existing @asyncio.coroutines
will have __iter__ and __next__ working just fine.

Sorry if this was misleading.


This means that converting one body of code to the
new style can force changes in other code that
interacts with it.

Maybe this is not considered a problem, but as long
as it's true, I don't think it's accurate to claim
"full backwards compatibility".



I covered this in point #4.  I also touched on this in
https://www.python.org/dev/peps/pep-0492/#migration-strategy


I'm still waiting for feedback on this from Guido.  If
he decides to go with RuntimeWarnings, then it's 100%
backwards compatible.  If we keep TypeErrors --
then *existing code will work on 3.5*, but something
*might* break during adopting new syntax.  I'll update
the Backwards Compatibility section.


Thanks,
Yury


Re: [Python-Dev] PEP 492: async/await in Python; version 4

2015-04-30 Thread Guido van Rossum
On Thu, Apr 30, 2015 at 4:24 PM, Greg Ewing 
wrote:

> Yury Selivanov wrote:
>
>> Well, using next() and iter() on coroutines in asyncio
>> code is something esoteric.  I can't even imagine
>> why you would want to do that.
>>
>
> I'm talking about the fact that existing generator-
> based coroutines that aren't decorated with
> @coroutine won't be able to call new ones that use
> async def.
>
> This means that converting one body of code to the
> new style can force changes in other code that
> interacts with it.
>
> Maybe this is not considered a problem, but as long
> as it's true, I don't think it's accurate to claim
> "full backwards compatibility".
>

Greg, you seem to have an odd notion of "full backwards compatibility". The
term means that old code won't break. It doesn't imply that old and new
code can always seamlessly interact (that would be an impossibly high bar
for almost any change).

That said, interoperability between old code and new code is an area of
interest. But if the only thing that's standing between old code and new
code is the @coroutine decorator, things are looking pretty good -- that
decorator is already strongly required for coroutines intended for use with
the asyncio package, and older versions of the asyncio package also define
that decorator, so if there's old code out there that needs to be able to
call the new coroutines (by whatever name, e.g. async functions :-), adding
the @coroutine decorator to the old code doesn't look like too much of a
burden.

I assume there might be code out there that uses yield-from-based
coroutines but does not use the asyncio package, but I doubt there is much
code like that (I haven't seen much mention of yield-from outside its use
in asyncio). So I think the interop problem is mostly limited to
asyncio-using code that plays loose with the @coroutine decorator
requirement and now wants to work with the new async functions. That's easy
enough to address.

-- 
--Guido van Rossum (python.org/~guido)


Re: [Python-Dev] PEP 492 vs. PEP 3152, new round

2015-04-30 Thread Greg

On 1/05/2015 5:38 a.m., Guido van Rossum wrote:

you can write "not -x" but you can't write "- not x".


That seems just as arbitrary and unintuitive, though.

There are some other unintuitive consequences as well, e.g.
you can write

   not a + b

but it's not immediately obvious that this is parsed as
'not (a + b)' rather than '(not a) + b'.
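Both points are easy to check directly:

```python
a, b = 0, 1

# 'not a + b' is 'not (a + b)', since '+' binds tighter than 'not':
assert (not a + b) is False        # not (0 + 1) -> False
assert ((not a) + b) == 2          # (not 0) + 1 -> True + 1 -> 2

# 'not -x' compiles, but '- not x' is a syntax error:
compile("not -x", "<demo>", "eval")
try:
    compile("- not x", "<demo>", "eval")
    parsed = True
except SyntaxError:
    parsed = False
assert parsed is False
```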

The presence of one arbitrary and unintuitive thing in the
grammar is not by itself a justification for adding another one.

--
Greg



Re: [Python-Dev] PEP 492 vs. PEP 3152, new round

2015-04-30 Thread Greg

On 1/05/2015 6:04 a.m., Yury Selivanov wrote:


I still want to see where my current grammar forces to use
parens.  See [1], there are no useless parens anywhere.


It's not about requiring or not requiring parens. It's about
making the simplest possible change to the grammar necessary
to achieve the desired goals. Keeping the grammar simple
makes it easy for humans to reason about.

The question is whether syntactically disallowing certain
constructs that are unlikely to be needed is a desirable
enough goal to be worth complicating the grammar. You think
it is, some others of us think it's not.


FWIW, I'll fix the 'await (await x)' expression to be parsed
without parens.


I don't particularly care whether 'await -x' or 'await await x'
can be written without parens or not. The point is that the
simplest grammar change necessary to be able to write the
things we *do* want also happens to allow those. I don't see
that as a problem worth worrying about.

--
Greg



Re: [Python-Dev] PEP 492 vs. PEP 3152, new round

2015-04-30 Thread Devin Jeanpierre
On Thu, Apr 30, 2015 at 6:13 PM, Greg  wrote:
> It's not about requiring or not requiring parens. It's about
> making the simplest possible change to the grammar necessary
> to achieve the desired goals. Keeping the grammar simple
> makes it easy for humans to reason about.
>
> The question is whether syntactically disallowing certain
> constructs that are unlikely to be needed is a desirable
> enough goal to be worth complicating the grammar. You think
> it is, some others of us think it's not.

+1. It seems weird to add a whole new precedence level when an
existing one works fine. Accidentally negating a future/deferred is
not a significant source of errors, so I don't get why that would be a
justifying example.

-- Devin


Re: [Python-Dev] PEP 492 vs. PEP 3152, new round

2015-04-30 Thread Guido van Rossum
On Thu, Apr 30, 2015 at 6:56 PM, Devin Jeanpierre 
wrote:

> On Thu, Apr 30, 2015 at 6:13 PM, Greg  wrote:
> > It's not about requiring or not requiring parens. It's about
> > making the simplest possible change to the grammar necessary
> > to achieve the desired goals. Keeping the grammar simple
> > makes it easy for humans to reason about.
> >
> > The question is whether syntactically disallowing certain
> > constructs that are unlikely to be needed is a desirable
> > enough goal to be worth complicating the grammar. You think
> > it is, some others of us think it's not.
>
> +1. It seems weird to add a whole new precedence level when an
> existing one works fine. Accidentally negating a future/deferred is
> not a significant source of errors, so I don't get why that would be a
> justifying example.
>

You can call me weird, but I *like* fine-tuning operator binding rules to
suit my intuition for an operator. 'await' is not arithmetic, so I don't
see why it should be lumped in with '-'. It's not like the proposed grammar
change introducing 'await' is earth-shattering in complexity.

-- 
--Guido van Rossum (python.org/~guido)


Re: [Python-Dev] PEP 492 vs. PEP 3152, new round

2015-04-30 Thread Nathaniel Smith
On Apr 30, 2015 1:57 AM, "Greg Ewing"  wrote:
>
> Nathaniel Smith wrote:
>>
>> Even if we put aside our trained intuitions about arithmetic, I think
>> it's correct to say that the way unary minus is parsed is: everything
>> to the right of it that has a tighter precedence gets collected up and
>> parsed as an expression, and then it takes that expression as its
>> argument.
>
>
> Tighter or equal, actually: '--a' is allowed.
>
> This explains why Yury's syntax disallows 'await -f'.
> The 'await' operator requires something after it, but
> there's *nothing* between it and the following '-',
> which binds less tightly.
>
> So it's understandable, but you have to think a bit
> harder.
>
> Why do we have to think harder? I suspect it's because
> the notion of precedence is normally introduced to resolve
> ambiguities. Knowing that infix '*' has higher precedence
> than infix '+' tells us that 'a + b * c' is parsed as
> 'a + (b * c)' and not '(a + b) * c'.
>
> Similarly, knowing that infix '.' has higher precedence
> than prefix '-' tells us that '-a.b' is parsed as
> '-(a.b)' rather than '(-a).b'.
>
> However, giving prefix 'await' higher precedence than
> prefix '-' doesn't serve to resolve any ambiguity.
> '- await f' is parsed as '-(await f)' either way, and
> 'await f + g' is parsed as '(await f) + g' either way.
>
> So when we see 'await -f', we think we already know
> what it means. There is only one possible order for
> the operations, so it doesn't look as though precedence
> comes into it at all, and we don't consider it when
> judging whether it's a valid expression.

The other reason this threw me is that I've recently been spending time
with a shunting yard parser, and in shunting yard parsers unary prefix
operators just work in the expected way (their precedence only affects
their interaction with later binary operators; a chain of unaries is always
allowed). It's just a limitation of the parser generator tech that python
uses that it can't handle unary operators in the natural fashion. (OTOH it
can handle lots of cases that shunting yard parsers can't -- I'm not
criticizing python's choice of parser.) Once I read the new "documentation
grammar" this became much clearer.

> What's the conclusion from all this? I think it's
> that using precedence purely to disallow certain
> constructs, rather than to resolve ambiguities, leads
> to a grammar with less-than-intuitive characteristics.

The actual effect of making "await" a different precedence is to resolve
the ambiguity in
  await x ** 2

If await acted like -, then this would be
  await (x ** 2)
But with the proposed grammar, it's instead
  (await x) ** 2
Which is probably correct, and produces the IMHO rather nice invariant that
"await" binds more tightly than arithmetic in general (instead of having to
say that it binds more tightly than arithmetic *except* in this one corner
case...).
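On a Python that implements PEP 492, the difference is easy to verify with the `ast` module:

```python
import ast

# 'await x ** 2' parses as '(await x) ** 2':
ret = ast.parse("async def f(x):\n    return await x ** 2").body[0].body[0]
assert isinstance(ret.value, ast.BinOp)        # top node is the power op
assert isinstance(ret.value.left, ast.Await)   # await bound tighter than **

# whereas unary minus binds looser than '**': '-x ** 2' is '-(x ** 2)':
neg = ast.parse("-x ** 2").body[0].value
assert isinstance(neg, ast.UnaryOp)            # top node is the minus
assert isinstance(neg.operand, ast.BinOp)
```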

But then given the limitations of Python's parser plus the desire to
disambiguate the expression above in the given way, it becomes an arguably
regrettable, yet inevitable, consequence that
  await -fut
  await +fut
  await ~fut
become parse errors.

AFAICT these and the ** case are the only expressions where there's any
difference between Yury's proposed grammar and your proposal of treating
await like unary minus.

-n


Re: [Python-Dev] PEP 492 vs. PEP 3152, new round

2015-04-30 Thread Guido van Rossum
On Thu, Apr 30, 2015 at 8:30 PM, Nathaniel Smith  wrote:

> The actual effect of making "await" a different precedence is to resolve
> the ambiguity in
>
>   await x ** 2
>
> If await acted like -, then this would be
>   await (x ** 2)
> But with the proposed grammar, it's instead
>   (await x) ** 2
> Which is probably correct, and produces the IMHO rather nice invariant
> that "await" binds more tightly than arithmetic in general (instead of
> having to say that it binds more tightly than arithmetic *except* in this
> one corner case...)
>
Correct.

> AFAICT these and the ** case are the only expressions where there's any
> difference between Yury's proposed grammar and your proposal of treating
> await like unary minus. But then given the limitations of Python's parser
> plus the desire to disambiguate the expression above in the given way, it
> becomes an arguably regrettable, yet inevitable, consequence that
>   await -fut
>   await +fut
>   await ~fut
> become parse errors.
>
 Why is that regrettable? Do you have a plan for overloading one of those
on Futures? I personally consider it a feature that you can't do that. :-)

-- 
--Guido van Rossum (python.org/~guido)


Re: [Python-Dev] PEP 492 vs. PEP 3152, new round

2015-04-30 Thread Nathaniel Smith
On Apr 30, 2015 8:40 PM, "Guido van Rossum"  wrote:
>
> On Thu, Apr 30, 2015 at 8:30 PM, Nathaniel Smith  wrote:
>>
>> The actual effect of making "await" a different precedence is to resolve
the ambiguity in
>>
>>   await x ** 2
>>
>> If await acted like -, then this would be
>>   await (x ** 2)
>> But with the proposed grammar, it's instead
>>   (await x) ** 2
>> Which is probably correct, and produces the IMHO rather nice invariant
that "await" binds more tightly than arithmetic in general (instead of
having to say that it binds more tightly than arithmetic *except* in this
one corner case...)
>
> Correct.
>>
>> AFAICT these and the ** case are the only expressions where there's any
difference between Yury's proposed grammar and your proposal of treating
await like unary minus. But then given the limitations of Python's parser
plus the desire to disambiguate the expression above in the given way, it
becomes an arguably regrettable, yet inevitable, consequence that
>>
>>   await -fut
>>   await +fut
>>   await ~fut
>> become parse errors.
>
>  Why is that regrettable? Do you have a plan for overloading one of those
on Futures? I personally consider it a feature that you can't do that. :-)

I didn't say it was regrettable, I said it was arguably regrettable. For
proof, see the last week of python-dev ;-).

(I guess all else being equal it would be nice if unary operators could
stack arbitrarily, since that really is the more natural parse rule IMO and
also if things had worked that way then I would have spent this thread less
confused. But this is a pure argument from elegance. In practice there's
obviously no good reason to write "await -fut" or "-not x", so meh,
whatever.)
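
[Editorial aside: the released CPython grammar kept exactly this restriction, which is easy to verify with `ast`; `fut` is a placeholder name.]

```python
import ast

# The corner case under discussion: `await` cannot take a unary-op
# operand directly, so these three spellings fail to parse.
results = {}
for src in ("await -fut", "await +fut", "await ~fut"):
    try:
        ast.parse(f"async def f():\n    return {src}")
        results[src] = "parsed"
    except SyntaxError:
        results[src] = "SyntaxError"
print(results)

# With explicit parentheses, both readings are accepted:
ast.parse("async def f():\n    return await (-fut)")
ast.parse("async def f():\n    return -(await fut)")
```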

-n


Re: [Python-Dev] Unicode literals in Python 2.7

2015-04-30 Thread Stephen J. Turnbull
Adam Bartoš writes:

 > Unfortunately, it doesn't work. With PYTHONIOENCODING=utf-8, the
 > sys.std* streams are created with utf-8 encoding (which doesn't
 > help on Windows since they still don't use ReadConsoleW and
 > WriteConsoleW to communicate with the terminal) and after changing
 > the sys.std* streams to the fixed ones and setting readline hook,
 > it still doesn't work,

I don't see why you would expect it to work: either your code is
bypassing PYTHONIOENCODING=utf-8 processing, and that variable doesn't
matter, or you're feeding already decoded text *as UTF-8* to your
module which evidently expects something else (UTF-16LE?).

 > so presumably the PyCF_SOURCE_IS_UTF8 is still not set.

I don't think that flag does what you think it does.  AFAICT from
looking at the source, that flag gets unconditionally set in the
execution context for compile, eval, and exec, and it is checked in
the parser when creating an AST node.  So it looks to me like it
asserts that the *internal* representation of the program is UTF-8
*after* transforming the input to an internal representation (doing
charset decoding, removing comments and line continuations, etc).

 > > Regarding your environment, the repeated use of "custom" is a red
 > > flag.  Unless you bundle your whole environment with the code you
 > > distribute, Python can know nothing about that.  In general, Python
 > > doesn't know what encoding it is receiving text in.
 > 
 > Well, the received text comes from sys.stdin and its encoding is
 > known.

How?  You keep asserting this.  *You* know, but how are you passing
that information to *the Python interpreter*?  Guido may have a time
machine, but nobody claims the Python interpreter is telepathic.

 > Ideally, Python would receive the text as a Unicode string object so
 > there would be no problem with encoding

Forget "ideal".  Python 3 was created (among other reasons) to get
closer to that ideal.  But programs in Python 2 are received as str,
which is bytes in an ASCII-compatible encoding, not unicode (unless
otherwise specified by PYTHONIOENCODING or a coding cookie in a source
file, and as far as I know those are the only ways to specify source
encoding).  This specification of "Python program" isn't going to
change in Python 2; that's one of the major unfixable reasons that
Python 2 and Python 3 will be incompatible forever.

 > The custom stdio streams and readline hooks are set at runtime by a
 > code in sitecustomize. It does not affect IDLE and it is compatible
 > with IPython. I would like to also set PyCF_SOURCE_IS_UTF8 at
 > runtime from Python e.g. via ctypes. But this may be impossible.

 > Yes. In the latter case, eval has no idea how the bytes given are
 > encoded.

Eval *never* knows how bytes are encoded, not even implicitly.  That's
one of the important reasons why Python 3 was necessary.  I think you
know that, but you don't write like you understand the implications
for your current work, which makes it hard to communicate.
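
[Editorial aside: for contrast, Python 3 makes the encoding of bytes source explicit: compile() decodes bytes as UTF-8 by default (PEP 3120) or per a coding cookie. A small sketch of that behavior:]

```python
# Bytes handed to compile() in Python 3 are decoded as UTF-8 by default,
# or according to a coding cookie -- the encoding is declared, not guessed.
src_utf8 = "s = 'é'".encode("utf-8")
ns = {}
exec(compile(src_utf8, "<utf8>", "exec"), ns)
assert ns["s"] == "é"

# Same text, different byte encoding, announced by a coding cookie:
src_latin = "# -*- coding: latin-1 -*-\ns = 'é'".encode("latin-1")
ns2 = {}
exec(compile(src_latin, "<latin1>", "exec"), ns2)
assert ns2["s"] == ns["s"]
```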




Re: [Python-Dev] PEP 492 vs. PEP 3152, new round

2015-04-30 Thread Greg Ewing

Nathaniel Smith wrote:


If await acted like -, then this would be
  await (x ** 2)
But with the proposed grammar, it's instead
  (await x) ** 2


Ah, I had missed that!

This is a *good* argument for Yury's grammar.
I withdraw my objection now.

--
Greg