Re: [Python-Dev] Please reconsider PEP 479.
I'm not particularly opposed to PEP 479, but the Abstract and
Rationale could do with considerable clarification. They currently
appear to promise things that are in disagreement with what the PEP
actually delivers.

The Abstract claims that the proposal will "unify the behaviour of
list comprehensions and generator expressions", but it doesn't do
that. What it actually does is provide special protection against
escaped StopIteration exceptions in one particular context (the body
of a generator). It doesn't prevent StopIteration from escaping
anywhere else, including from list comprehensions, so if anything it
actually *increases* the difference between generators and
comprehensions.

There may be merit in preventing rogue StopIterations escaping from
generators, but the PEP should sell the idea on that basis, not on
what sounds like a false promise that it will make comprehensions and
generators behave identically.

--
Greg

_______________________________________________
Python-Dev mailing list
[email protected]
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] async/await in Python; v2
On 23/04/2015 6:32 a.m., Andrew Svetlov wrote:
> If we forbid to call `async def` from regular code how asyncio
> should work? I'd like to push `async def` everywhere in asyncio API
> where asyncio.coroutine required.

As I suggested earlier, a way could be provided to mark a function as
callable using either yield from f() or await f(). That would water
down the error catching ability a bit, but it would allow
interoperability with existing asyncio code.

--
Greg
Re: [Python-Dev] PEP 492: async/await in Python; v3
On 29/04/2015 9:49 a.m., Guido van Rossum wrote:
>> c) 'yield from' only accept coroutine objects from generators
>> decorated with 'types.coroutine'. That means that existing asyncio
>> generator-based coroutines will happily yield from both coroutines
>> and generators. *But* every generator-based coroutine *must* be
>> decorated with `asyncio.coroutine()`. This is potentially a
>> backwards incompatible change. See below.
>
> I worry about backward compatibility. A lot. Are you saying that
> asyncio-based code that doesn't use @coroutine will break in 3.5?

That seems unavoidable if the goal is for 'await' to only work on
generators that are intended to implement coroutines, and not on
generators that are intended to implement iterators. Because there's
no way to tell them apart without marking them in some way.

--
Greg
Re: [Python-Dev] PEP 492 vs. PEP 3152, new round
On 1/05/2015 5:38 a.m., Guido van Rossum wrote:
> you can write "not -x" but you can't write "- not x".

That seems just as arbitrary and unintuitive, though. There are some
other unintuitive consequences as well, e.g. you can write

    not a + b

but it's not immediately obvious that this is parsed as 'not (a + b)'
rather than '(not a) + b'.

The presence of one arbitrary and unintuitive thing in the grammar is
not by itself a justification for adding another one.

--
Greg
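The parse Greg describes is easy to verify (a minimal demonstration;
any Python 3 will do):

```python
a, b = 0, 1

# 'not' binds more loosely than '+', so 'not a + b' means 'not (a + b)'
assert (not a + b) == (not (a + b))     # not (0 + 1) -> False
assert (not a + b) != ((not a) + b)     # (not 0) + 1 -> 2
```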
Re: [Python-Dev] PEP 492 vs. PEP 3152, new round
On 1/05/2015 6:04 a.m., Yury Selivanov wrote:
> I still want to see where my current grammar forces to use parens.
> See [1], there are no useless parens anywhere.

It's not about requiring or not requiring parens. It's about making
the simplest possible change to the grammar necessary to achieve the
desired goals. Keeping the grammar simple makes it easy for humans to
reason about.

The question is whether syntactically disallowing certain constructs
that are unlikely to be needed is a desirable enough goal to be worth
complicating the grammar. You think it is, some others of us think
it's not.

> FWIW, I'll fix the 'await (await x)' expression to be parsed without
> parens.

I don't particularly care whether 'await -x' or 'await await x' can
be written without parens or not. The point is that the simplest
grammar change necessary to be able to write the things we *do* want
also happens to allow those. I don't see that as a problem worth
worrying about.

--
Greg
Re: [Python-Dev] Enable access to the AST for Python code
On 22/05/2015 1:33 p.m., Ethan Furman wrote:
> Going back to the OP:
>
>     select(c for c in Customer if sum(c.orders.price) > 1000)
>
> which compiles into and runs SQL like this:
>
>     SELECT "c"."id"
>     FROM "Customer" "c"
>     LEFT JOIN "Order" "order-1" ON "c"."id" = "order-1"."customer"
>     GROUP BY "c"."id"
>     HAVING coalesce(SUM("order-1"."total_price"), 0) > 1000
>
> That last code is /not/ Python. ;)
More importantly, it's not Python *semantics*. You can't view
it as simply a translation of the Python expression into a
different language.
I still think this is really a macro facility by a different
name. I'm not saying that's a bad thing, just pointing it out.
The main difference is that a macro would (or at least could)
be expanded at compile time, whereas this would require
processing the AST each time it's used.
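Pony actually recovers the query by decompiling the generator's
bytecode, but the macro-ish flavour is easy to see at the AST level
(a sketch using the stdlib ast module; select and Customer are just
names here, not a real API):

```python
import ast

src = "select(c for c in Customer if sum(c.orders.price) > 1000)"
tree = ast.parse(src, mode="eval")

# The argument to select() arrives as a GeneratorExp node, which a
# library could walk and translate into SQL instead of executing.
call = tree.body
assert isinstance(call, ast.Call)
genexp = call.args[0]
assert isinstance(genexp, ast.GeneratorExp)
assert genexp.generators[0].target.id == "c"
```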
--
Greg
Re: [Python-Dev] Deleting with setting C API functions
On 3/12/2015 5:41 a.m., Random832 wrote:
> Why bother with the dot? Why not rename 3.5 to Python 5, and then go
> to Python 6, etc, and then your "4.0" would be 10.

Then we could call it Python X! Everything is better with an X in the
name.

--
Greg
Re: [Python-Dev] PEP 461 updates
On 17/01/2014 10:18 a.m., Terry Reedy wrote:
> On 1/16/2014 5:11 AM, Nick Coghlan wrote:
>> Guido's successful counter was to point out that the parsing of the
>> format string itself assumes ASCII compatible data,

Nick's initial arguments against bytes formatting were very abstract
and philosophical, along the lines that it violated some pure mental
model of text/bytes separation. Then Guido said something that Nick
took to be an equal and opposite philosophical argument that
cancelled out his original objections, and he withdrew them.

I don't think it matters whether the internal details of that debate
make sense to the rest of us. The main thing is that a consensus
seems to have been reached on bytes formatting being basically a good
thing.

--
Greg
Re: [Python-Dev] PEP 435 -- Adding an Enum type to the Python standard library
On 26/04/2013 1:15 p.m., Ethan Furman wrote:
> Lots of counting systems wrap: seconds, minutes, hours, days of
> week, days of month, days of year, millimeters, inches, etc., etc.

But we don't disagree on which is the first minute of an hour.

> We still apply ordering to them, and talk about 15 being less
> than 42.

When we do that, we're using cardinal numbers (how many), not ordinal
numbers (what order).

--
Greg
Re: [Python-Dev] PEP 435 -- Adding an Enum type to the Python standard library
On 26/04/2013 2:38 p.m., Guido van Rossum wrote:
> Can't we do some kind of callable check? There may be some weird
> decorators that won't work, but they aren't likely to be useful in
> this context.
Another possible solution:

    class Color:
        red = 1
        white = 2
        blue = 3
        orange = 4

        class __methods__:
            def wave(self, n=1):
                for _ in range(n):
                    print('Waving', self)

and have the metaclass pull the functions out of the __methods__
sub-object.
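A rough sketch of how the metaclass side of this might work (entirely
hypothetical; not the design PEP 435 ended up adopting):

```python
class EnumMeta(type):
    def __new__(mcls, name, bases, ns):
        # pull the functions out of the __methods__ sub-object, if any
        methods = ns.pop('__methods__', None)
        if methods is not None:
            for attr, value in vars(methods).items():
                if not attr.startswith('__'):
                    ns[attr] = value
        return super().__new__(mcls, name, bases, ns)

class Color(metaclass=EnumMeta):
    red = 1
    white = 2
    blue = 3
    orange = 4

    class __methods__:
        def wave(self, n=1):
            return ['Waving %s' % self] * n

c = Color()
assert c.wave(2) == ['Waving %s' % c] * 2
assert not hasattr(Color, '__methods__')   # the sub-object is consumed
```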
--
Greg
Re: [Python-Dev] PEP 435 -- Adding an Enum type to the Python standard library
On 26/04/2013 1:28 p.m., Ethan Furman wrote:
> Interesting idea, but why does Day(3) have to be disallowed to make
> it work?

Because it's ambiguous. Which day of the week is number 3? It depends
on where you start.

I should perhaps point out that the numbers assigned to the values
initially are just to establish the relative ordering. They wouldn't
be directly accessible once the values are created. To get an integer
value corresponding to a Day value, you would have to do arithmetic:

    Day.wednesday - Day.sunday --> 3

--
Greg
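A minimal sketch of the ordering-only idea (hypothetical; the enum
module that PEP 435 produced exposes .value instead):

```python
class Day:
    _order = ('sunday', 'monday', 'tuesday', 'wednesday',
              'thursday', 'friday', 'saturday')

    def __init__(self, name):
        self.name = name
        self._index = Day._order.index(name)

    def __sub__(self, other):
        # arithmetic is the only way to get an integer back out
        return self._index - other._index

Day.wednesday = Day('wednesday')
Day.sunday = Day('sunday')
assert Day.wednesday - Day.sunday == 3
```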
Re: [Python-Dev] PEP 435 -- Adding an Enum type to the Python standard library
On 26/04/2013 3:06 p.m., Glenn Linderman wrote:
> So what I'm hearing is that enumerations need to be a language
> feature, rather than a module:
>
>     Can't combine Enum and EnumItem
>     Can't import into locals
The compiler could do those things, though.
Maybe we've found a use case for the recently advertised
macro system?
("One, two, five..." runs for cover...)
--
Greg
Re: [Python-Dev] PEP 435 -- Adding an Enum type to the Python standard library
On 26/04/2013 3:12 p.m., Glenn Linderman wrote:
> On 4/25/2013 7:49 PM, Nick Coghlan wrote:
>> You couldn't create an enum of callables, but that would be a
>> seriously weird thing to do anyway
>
> But aren't all classes callable?

An enum of classes would be seriously weird as well, I think.

--
Greg
Re: [Python-Dev] PEP 450 adding statistics module
On 9/09/2013 5:52 a.m., Guido van Rossum wrote:
> Well, to me zip(*x) is unnatural, and it's inefficient when the
> arrays are long.

Would it be worth having a transpose() function in the stdlib
somewhere, that returns a view instead of copying the data?

--
Greg
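A minimal sketch of what a view-returning transpose() might look like
(hypothetical; nothing like this exists in the stdlib):

```python
class Column:
    def __init__(self, rows, j):
        self._rows, self._j = rows, j
    def __len__(self):
        return len(self._rows)
    def __getitem__(self, i):
        return self._rows[i][self._j]   # reads through; no copying

class Transposed:
    def __init__(self, rows):
        self._rows = rows
    def __len__(self):
        return len(self._rows[0]) if self._rows else 0
    def __getitem__(self, j):
        return Column(self._rows, j)

data = [(1, 2.0), (3, 4.0), (5, 6.0)]
cols = Transposed(data)
assert list(cols[1]) == [2.0, 4.0, 6.0]   # one column, data untouched
```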
Re: [Python-Dev] Coroutines and PEP 380
Glyph wrote:
> [Guido] mentions the point that coroutines that can implicitly
> switch out from under you have the same non-deterministic property
> as threads: you don't know where you're going to need a lock or
> lock-like construct to update any variables, so you need to think
> about concurrency more deeply than if you could explicitly always
> see a 'yield'.

I'm not convinced that being able to see 'yield's will help all that
much. In any system that makes substantial use of generator-based
coroutines, you're going to see 'yield from's all over the place, from
the lowest to the highest levels. But that doesn't mean you need a
correspondingly large number of locks. You can't look at a 'yield'
and conclude that you need a lock there or tell what needs to be
locked.

There's no substitute for deep thought where any kind of threading is
involved, IMO.

--
Greg
Re: [Python-Dev] Exposing the Android platform existence to Python modules
Shiz wrote:
> I'm not sure a check to see if e.g. /system exists is really enough
> to conclude Python is running on Android on its own.

Since MacOSX has /System and typically a case-insensitive file
system, it certainly wouldn't. :-)

--
Greg
Re: [Python-Dev] sum(...) limitation
Steven D'Aprano wrote:
> I've long believed that + is the wrong operator for concatenating
> strings, and that & makes a much better operator.

Do you have a reason for preferring '&' in particular, or do you just
want something different from '+'? Personally I can't see why
"bitwise and" on strings should be a better metaphor for
concatenation than "addition". :-)

--
Greg
Re: [Python-Dev] Bytes path support
Ben Hoyt wrote:
> Does that mean that new APIs should explicitly not support bytes?
> ...
> Bytes paths are essentially broken on Windows.

But on Unix, paths are essentially bytes. What's the official policy
for dealing with that?

--
Greg
Re: [Python-Dev] Bytes path support
Stephen J. Turnbull wrote:
> This case can be handled now using the surrogateescape error
> handler,

So maybe the way to make bytes paths go away is to always use
surrogateescape for paths on unix?

--
Greg
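The handler round-trips arbitrary bytes losslessly, which is what
makes the idea workable (it is in fact what os.fsdecode()/os.fsencode()
do on Unix):

```python
raw = b'caf\xe9'                      # Latin-1 bytes; not valid UTF-8
text = raw.decode('utf-8', 'surrogateescape')
assert text == 'caf\udce9'            # the bad byte becomes a lone surrogate
assert text.encode('utf-8', 'surrogateescape') == raw   # lossless round trip
```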
Re: [Python-Dev] Bytes path support
Antoine Pitrou wrote:
> I think if you want low-level features (such as unconverted bytes
> paths under POSIX), it is reasonable to point you to low-level APIs.

The problem with scandir() in particular is that there is currently
*no* low-level API exposed that gives the same functionality. If
scandir() is not to support bytes paths, I'd suggest exposing the
opendir() and readdir() system calls with bytes path support.

--
Greg
Re: [Python-Dev] Bytes path support
Isaac Morland wrote:
> In HTML 5 it allows non-ASCII-compatible encodings as long as U+FEFF
> (byte order mark) is used:
> http://www.w3.org/TR/html-markup/syntax.html#encoding-declaration
> Not sure about XML.

According to Appendix F here:

    http://www.w3.org/TR/xml/#sec-guessing

an XML parser needs to be prepared to try all the encodings it
supports until it finds one that works well enough to decode the XML
declaration, then it can find out the exact encoding used.

--
Greg
Re: [Python-Dev] surrogatepass - she's a witch, burn 'er! [was: Cleaning up ...]
M.-A. Lemburg wrote:
> we needed a way to make sure that Python 3 also optionally supports
> working with lone surrogates in such UTF-8 streams (nowadays called
> CESU-8: http://en.wikipedia.org/wiki/CESU-8).

I don't think CESU-8 is the same thing. According to the wiki page,
CESU-8 *requires* all code points above 0xFFFF to be split into
surrogate pairs before encoding. It also doesn't say that lone
surrogates are valid -- it doesn't mention lone surrogates at all,
only pairs. Neither does the linked technical report.

The technical report also says that CESU-8 forbids any UTF-8
sequences of more than three bytes, so it's definitely not "UTF-8
plus lone surrogates".

--
Greg
Re: [Python-Dev] RFC: PEP 475, Retry system calls failing with EINTR
Victor Stinner wrote:
> As written in the PEP, if you want to be notified of the signal, set
> a signal handler which raises an exception.

I'm not convinced that this covers all possible use cases. It might
be all right if you have control over the signal handler, but what if
you don't?

I think it's best if the functions in the os module remain thin
wrappers that expose the OS functionality as fully and directly as
possible. Anything else should be provided by higher-level
facilities.

--
Greg
Re: [Python-Dev] RFC: PEP 475, Retry system calls failing with EINTR
Victor Stinner wrote:
> On 1 Sept 2014 00:17, Marko Rauhamaa wrote:
>> If a signal is received when read() or write() has completed its
>> task partially (> 0 bytes), no EINTR is returned but the partial
>> count. Obviously, Python should take that possibility into account
>> so that raising an exception in the signal handler (as mandated by
>> the PEP) doesn't cause the partial result to be lost on os.read()
>> or os.write().
>
> This case is unrelated to the PEP, the PEP only changes the
> behaviour when a syscall fails with EINTR.

I think there's a problem here, though. As things stand, a signal
handler that doesn't raise an exception can set a flag, and code
after the read() can test it. Under the proposed scheme, the signal
handler has to be made to raise an exception so that the read will be
broken out of in the EINTR case. But what happens if the read returns
*without* an EINTR? The signal handler will still raise an exception,
which is either going to clobber the partial return value or mess up
the code that does something with it.

--
Greg
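The flag-setting pattern Greg is defending looks like this (a
Unix-only sketch; the signal is raised synchronously here just for
demonstration):

```python
import os
import signal

got_signal = []

def handler(signum, frame):
    got_signal.append(signum)   # just record it; don't raise

signal.signal(signal.SIGUSR1, handler)

r, w = os.pipe()
os.write(w, b'partial data')
os.kill(os.getpid(), signal.SIGUSR1)

# The handler runs without raising, so the read's result is not lost,
# and the code after the read can still test the flag.
data = os.read(r, 4096)
assert data == b'partial data'
assert got_signal == [signal.SIGUSR1]

os.close(r)
os.close(w)
```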
Re: [Python-Dev] PEP 394 - Clarification of what "python" command should invoke
Donald Stufft wrote:
> My biggest problem with ``python3``, is what happens after 3.9.

Python2 technically includes 1.x versions as well, so it wouldn't be
unprecedented for python3 to imply versions beyond 3.x. It would be a
bit confusing, though.

--
Greg
Re: [Python-Dev] PEP 394 - Clarification of what "python" command should invoke
Barry Warsaw wrote:
> On Sep 19, 2014, at 08:40 AM, Guido van Rossum wrote:
>> Until I say so. Which will happen in the distant future.
>
> I'm gonna hide your time machine keys so you didn't find them.

Hiding someone's time machine keys never works. Chances are he's
already taken a trip to the future in which you get kidnapped and
tortured until you reveal where you hid them, and then nipped over
there to take them back. Which means he *might* be able to avoid
carrying out the actual torture now, as long as it doesn't create too
big a temporal paradox.

--
Greg
Re: [Python-Dev] bytes-like objects
anatoly techtonik wrote:
> That's a cool stuff. `bytes-like object` is really a much better
> name for users.

I'm not so sure. Usually when we talk about an "xxx-like object" we
mean one that supports a certain Python interface, e.g. a "file-like
object" is one that has read() and/or write() methods. But you can't
create an object that supports the buffer protocol by implementing
Python methods.

I'm worried that using the term "bytes-like object" will lead people
to ask "What methods do I have to implement to make my object
bytes-like?", to which the answer is "mu".

--
Greg
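The asymmetry is easy to demonstrate (the Fake class below is made up
for illustration):

```python
import array

# array.array supports the buffer protocol at the C level...
buf = array.array('b', [104, 105])
assert bytes(memoryview(buf)) == b'hi'

# ...but no set of Python-level methods makes an object "bytes-like".
class Fake:
    def tobytes(self):
        return b'hi'

try:
    memoryview(Fake())
    accepted = True
except TypeError:
    accepted = False
assert not accepted
```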
Re: [Python-Dev] bytes-like objects
I wrote:
> But you can't create an object that supports the buffer protocol by
> implementing Python methods.

Another thing is that an object implementing the buffer interface
doesn't have to look anything at all like a bytes object from Python,
so calling it "bytes-like" could be rather confusing.

--
Greg
Re: [Python-Dev] Status of C compilers for Python on Windows
Nick Coghlan wrote:
> That assumption will allow MinGW-w64 to link with the appropriate
> MSVCRT versions for extension building without anything breaking.

If that works, then the same technique should allow CPython itself to
be built in a VS-compatible way with mingw, shouldn't it? Those
objecting to a mingw-built python seem to be assuming that such a
thing will necessarily be incompatible with VS builds, but I don't
see why that has to be the case.

--
Greg
Re: [Python-Dev] Real-world use of Counter
Ethan Furman wrote:
> Actually, it's asking, "Most other duck-typed methods will still
> raise a TypeError, but these few don't. Has that ever been a problem
> for you?"

I don't think I've *ever* been bothered by getting an AttributeError
instead of a TypeError or vice versa. Both indicate bugs in my code,
and I debug it by looking at the code and traceback; I don't try to
guess the problem based solely on the exception type.

In this case the code would have to go out of its way to turn an
AttributeError into a TypeError. I don't think the cost of that is
worth whatever small benefit there might be, if any.

Summary: Looks fine to me as it is.

--
Greg
Re: [Python-Dev] Please reconsider PEP 479.
Guido van Rossum wrote:
> Hm, that sounds like you're either being contrarian or Chris and I
> have explained it even worse than I thought.

I'm not trying to be contrary, I just think the PEP could explain
more clearly what you're trying to achieve. The rationale is too
vague and waffly at the moment.

> Currently, there are cases where list(x for x in xs if P(x)) works
> while [x for x in xs if P(x)] fails (when P(x) raises
> StopIteration). With the PEP, both cases will raise some exception

That's a better explanation, I think.

--
Greg
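Guido's example can be run directly under PEP 479 semantics (the
default since Python 3.7), where the two spellings now raise
different exceptions rather than one silently succeeding:

```python
def P(x):
    if x == 2:
        raise StopIteration   # a misbehaving predicate
    return True

xs = [0, 1, 2]

# Generator expression: PEP 479 turns the escaping StopIteration into
# a RuntimeError instead of silently ending the list() call early.
try:
    list(x for x in xs if P(x))
    outcome = None
except RuntimeError:
    outcome = 'RuntimeError'
assert outcome == 'RuntimeError'

# List comprehension: the StopIteration simply propagates as-is.
try:
    [x for x in xs if P(x)]
    outcome = None
except StopIteration:
    outcome = 'StopIteration'
assert outcome == 'StopIteration'
```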
Re: [Python-Dev] PEP 479 and asyncio
Guido van Rossum wrote:
> The issue here is that asyncio only interprets StopIteration as
> returning from the generator (with a possible value), while a
> Trollius coroutine must use "raise Return()" to specify a return
> value; this works as long as Return is a subclass of StopIteration,
> but PEP 479 will break this by replacing the StopIteration with
> RuntimeError.

I don't understand. If I'm interpreting PEP 479 correctly, in
'x = yield from foo', a StopIteration raised by foo.__next__()
doesn't get turned into a RuntimeError; rather it just stops the
sub-iteration as usual and its value attribute gets assigned to x.

As long as a Trollius coroutine behaves like something implementing
the iterator protocol, it should continue to work fine with Return as
a subclass of StopIteration. Or is there something non-obvious about
Trollius that I'm missing?

--
Greg
Re: [Python-Dev] advice needed: best approach to enabling "metamodules"?
Nathaniel Smith wrote:
> Option 4: Add a new function sys.swap_module_internals, which takes
> two module objects and swaps their __dict__ and other attributes. By
> making the operation a swap instead of an assignment, we avoid the
> lifecycle pitfalls from Option 3.

Didn't I see somewhere that module dicts are not being cleared on
shutdown any more? If so, then the lifetime problem mentioned here no
longer exists.

--
Greg
Re: [Python-Dev] advice needed: best approach to enabling "metamodules"?
Guido van Rossum wrote:
> Are these really all our options? All of them sound like hacks, none
> of them sound like anything the language (or even the CPython
> implementation) should sanction.

If assignment to the __class__ of a module were permitted (by
whatever means) then you could put this in a module:

    class __class__(types.ModuleType):
        ...

which makes it look almost like a deliberate language feature. :-)

Seriously, of the options presented, I think that allowing __class__
assignment is the most elegant, since it solves a lot of problems in
one go without introducing any new features -- just removing a
restriction that prevents an existing language mechanism from working
in this case.

--
Greg
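This is essentially what was later allowed: Python 3.5 made
__class__ assignable on modules. A minimal example:

```python
import types

class PropertyModule(types.ModuleType):
    @property
    def answer(self):
        # a computed attribute -- impossible on a plain module
        return 42

mod = types.ModuleType('demo')
mod.__class__ = PropertyModule      # permitted since Python 3.5
assert mod.answer == 42
```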
Re: [Python-Dev] advice needed: best approach to enabling "metamodules"?
Nathaniel Smith wrote:
> So pkgname/__new__.py might look like:
>
>     import sys
>     from pkgname._metamodule import MyModuleSubtype
>     sys.modules[__name__] = MyModuleSubtype(__name__, docstring)
>
> To start with, the 'from pkgname._metamodule ...' line is an
> infinite loop,

Why does MyModuleSubtype have to be imported from pkgname? It would
make more sense for it to be defined directly in __new__.py, wouldn't
it? Isn't the purpose of separating stuff out into __new__.py
precisely to avoid circularities like that?

--
Greg
Re: [Python-Dev] advice needed: best approach to enabling "metamodules"?
Steven D'Aprano wrote:
> If this feature is going to be used, I would expect to be able to
> re-use pre-written module types. E.g. having written "module with
> properties" (so to speak) once, I can just import it and use it in
> my next project.

There would be nothing to stop __new__.py importing it from another
module, as long as it's not any of the modules that are going to be
using it.

--
Greg
Re: [Python-Dev] datetime nanosecond support (ctd?)
MRAB wrote:
> Maybe, also, strptime could support "%*f" to gobble as many digits
> as are available.

The * would suggest that the number of digits is being supplied as a
parameter. Maybe "%?f".

--
Greg
Re: [Python-Dev] bytes & bytearray
Guido van Rossum wrote:
> On Mon, Jan 19, 2015 at 11:43 AM, Paul Sokolovsky wrote:
>> b.lower_inplace()
>> b.lower_i()
>
> Please don't go there. The use cases are too rare.

And if you have such a use case, it's not too hard to do

    b[:] = b.lower()

--
Greg
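For a bytearray, the slice assignment rewrites the existing buffer
rather than binding a new object, which is what an in-place method
would have bought:

```python
b = bytearray(b'HeLLo')
alias = b
b[:] = b.lower()        # mutates in place; every alias sees the change
assert b == bytearray(b'hello')
assert alias is b and alias == b'hello'
```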
Re: [Python-Dev] Why does STORE_MAP not take a parameter?
On 01/21/2015 11:16 PM, Neil Girdhar wrote:
> Why not have BUILD_MAP work like BUILD_LIST? I.e., STORE_MAP takes a
> parameter n and adds the last n pairs of stack elements into the n-1
> stack element (the dictionary).

It probably wouldn't make much difference. Building a list is
substantially cheaper if you have all the items on hand and can copy
them in en masse. But adding an item to a dict entails quite a lot of
overhead, since you need to hash the key, look for a free slot, etc.,
and this would likely swamp any gain from executing fewer bytecodes.

And on the other side of the equation, evaluating the items one at a
time requires less stack space, so the stack frame can be smaller.

But as always, you can't be sure without measuring it, and this would
be a good thing for someone interested to try out.

--
Greg
Re: [Python-Dev] Disassembly of generated comprehensions
Petr Viktorin wrote:
> On Sun, Jan 25, 2015 at 12:55 PM, Neil Girdhar wrote:
>> How do I disassemble a generated comprehension?
>
> Put it in a function, then get it from the function's code's
> constants.

It would be handy if dis had an option to disassemble nested
functions like this automatically.

--
Greg
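Digging the nested code object out by hand looks like this (later
Pythons did grow the convenience Greg asks for: dis.dis() recurses
into nested code objects since 3.7):

```python
import dis
import types

def f():
    return (x * x for x in range(10))

# the genexp's code object is stashed in the function's constants
inner = [c for c in f.__code__.co_consts if isinstance(c, types.CodeType)]
assert len(inner) == 1
listing = dis.Bytecode(inner[0]).dis()
assert 'FOR_ITER' in listing
```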
Re: [Python-Dev] Why co_names? Wouldn't be simpler to just use co_consts?
Andrea Griffini wrote:
> Sorry if the question is naive, but why is co_names needed? Wouldn't
> be simpler to just use co_consts?

One reason might be that keeping them separate means you can have up
to 256 names and 256 consts using 1-byte opcode arguments. Otherwise,
you'd be limited to a total of 256 of both.

--
Greg
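The split is visible on any small function:

```python
def f():
    return len("spam")

assert 'len' in f.__code__.co_names     # referenced global names
assert 'spam' in f.__code__.co_consts   # literal constants
```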
Re: [Python-Dev] Encoding of PyFrameObject members
Maciej Fijalkowski wrote:
> However, you can't access thread locals from signal handlers (since
> in some cases it mallocs, thread locals are built lazily if you're
> inside the .so, e.g. if python is built with --shared)

You might be able to use Py_AddPendingCall to schedule what you want
done outside the context of the signal handler. The call will be made
by the main thread, though, so if you need to access the frame of
whatever thread was running when the signal occurred, you will have
to track down its PyThreadState somehow and get the frame from there.
Not sure what would be involved in doing that.

--
Greg
Re: [Python-Dev] (no subject)
Donald Stufft wrote:
> why is:
>
>     print(*[1], *[2], 3)
>
> better than
>
>     print(*[1] + [2] + [3])?

It could potentially be a little more efficient by eliminating the
construction of an intermediate list.

> defining + or | or some other symbol for something similar to
> [1] + [2] but for dictionaries. This would mean that you could
> simply do:
>
>     func(**dict1 | dict(y=1) | dict2)

Same again, multiple ** avoids construction of an intermediate dict.

--
Greg
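Both spellings became legal with PEP 448 (Python 3.5); a quick check
of the call-site behaviour, using a tuple display so the result can
be inspected:

```python
args = [2]
assert (1, *args, 3) == (1, 2, 3)      # args spliced in directly

def f(**kw):
    return kw

d1, d2 = {'x': 1}, {'z': 2}
# multiple ** unpackings in one call, no intermediate merged dict
assert f(**d1, y=1, **d2) == {'x': 1, 'y': 1, 'z': 2}
```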
Re: [Python-Dev] (no subject)
Donald Stufft wrote: However [*item for item in ranges] is mapped more to something like this:

    result = []
    for item in iterable:
        result.extend(*item)

Actually it would be result.extend(item). But if that bothers you, you could consider the expansion to be

    result = []
    for item in iterable:
        for item1 in item:
            result.append(item1)

In other words, the * is shorthand for an extra level of looping. and it acts differently than if you just did *item outside of a list comprehension. Not sure what you mean by that. It seems closely analogous to the use of * in a function call to me. -- Greg ___ Python-Dev mailing list [email protected] https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
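The starred-comprehension syntax itself was never adopted, but the expansion described here can be written out and checked (the sample input is mine):

```python
iterable = [range(2), range(3)]

# One level of looping with extend ...
result = []
for item in iterable:
    result.extend(item)

# ... is the same as two explicit levels of looping.
expanded = []
for item in iterable:
    for item1 in item:
        expanded.append(item1)

assert result == expanded == [0, 1, 0, 1, 2]
```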
Re: [Python-Dev] (no subject)
Donald Stufft wrote: perhaps a better solution is to simply make it so that something like ``a_list + an_iterable`` is valid and the iterable would just be consumed and +’d onto the list. I don't think I like the asymmetry that this would introduce into + on lists. Currently [1, 2, 3] + (4, 5, 6) is an error because it's not clear whether the programmer intended the result to be a list or a tuple. I think that's a good thing. Also, it would mean that [1, 2, 3] + foo == [1, 2, 3, "f", "o", "o"] which would be surprising and probably not what was intended. -- Greg ___ Python-Dev mailing list [email protected] https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] (no subject)
Donald Stufft wrote: 1. The statement *item is roughly the same thing as (item[0], item[1], item[n]) No, it's not -- that would make it equivalent to tuple(item), which is not what it means in any of its existing usages. What it *is* roughly equivalent to is item[0], item[1], item[n] i.e. *without* the parens, whatever that means in the context concerned. In the context of a function call, it has the effect of splicing the sequence in as if you had written each item out as a separate expression. You do have a valid objection insofar as this currently has no meaning at all in a comprehension, i.e. this is a syntax error: [item[0], item[1], item[n] for item in items] So we would be giving a meaning to something that doesn't currently have a meaning, rather than changing an existing meaning, if you see what I mean. -- Greg ___ Python-Dev mailing list [email protected] https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
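The claim that the unparenthesised form has no existing meaning in a comprehension can be checked with compile(); a small sketch (the helper name and source strings are mine):

```python
def compiles(src):
    """Return True if src is syntactically valid Python."""
    try:
        compile(src, "<check>", "exec")
        return True
    except SyntaxError:
        return False

# A bare expression list before 'for' is rejected ...
assert not compiles("[item[0], item[1] for item in items]")
# ... while the parenthesised (tuple-building) form is fine.
assert compiles("[(item[0], item[1]) for item in items]")
```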
Re: [Python-Dev] (no subject)
Victor Stinner wrote: On 10 Feb 2015 06:48, "Greg Ewing" <[email protected]> wrote: > It could potentially be a little more efficient by > eliminating the construction of an intermediate list. Is it the case in the implementation? If it has to create a temporary list/tuple, I will prefer to not use it. The function call machinery will create a new tuple for the positional args in any case. But if you manually combine your * args into a tuple before calling, there are *two* tuple allocations being done. Passing all the * args directly into the call would allow one of them to be avoided. Similarly for dicts and ** args. -- Greg ___ Python-Dev mailing list [email protected] https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] (no subject)
Antoine Pitrou wrote:

    >>> bytearray(b"a") + b"bc"
    bytearray(b'abc')
    >>> b"a" + bytearray(b"bc")
    b'abc'

It's quite convenient. It's a bit disconcerting that the left operand wins, rather than one of them being designated as the "wider" type, as occurs with many other operations on mixed types, e.g. int + float. In any case, these seem to be special-case combinations. It's not so promiscuous as to accept any old iterable on the right:

    >>> b"a" + [1,2,3]
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
    TypeError: can't concat bytes to list
    >>> [1,2,3] + b"a"
    Traceback (most recent call last):
      File "<stdin>", line 1, in <module>
    TypeError: can only concatenate list (not "bytes") to list

-- Greg ___ Python-Dev mailing list [email protected] https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] (no subject)
John Wong wrote: I am actually amazed to remember dict + dict is not possible... there must be a reason (performance??) for this... I think it's mainly because there is no obviously correct answer to the question of what to do about duplicate keys. -- Greg ___ Python-Dev mailing list [email protected] https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
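The ambiguity is visible in the merge idioms that do exist; each hard-wires one answer to the duplicate-key question (examples use the {**...} unpacking added in Python 3.5):

```python
d1 = {"a": 1, "b": 2}
d2 = {"b": 20, "c": 30}

# {**d1, **d2}: later occurrences silently win.
assert {**d1, **d2} == {"a": 1, "b": 20, "c": 30}

# Reversing the operands gives a different answer -- choosing
# which side "wins" is exactly what a dict "+" would have to settle.
assert {**d2, **d1} == {"a": 1, "b": 2, "c": 30}
```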
Re: [Python-Dev] (no subject)
Georg Brandl wrote:
The call syntax part is a mixed bag: on the one hand it is nice to be
consistent with the extended possibilities in literals (flattening),
but on the other hand there would be small but annoying inconsistencies
anyways (e.g. the duplicate kwarg case above).
That inconsistency already exists -- duplicate keys are
allowed in dict literals but not calls:
>>> {'a':1, 'a':2}
{'a': 2}
--
Greg
___
Python-Dev mailing list
[email protected]
https://mail.python.org/mailman/listinfo/python-dev
Unsubscribe:
https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
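The inconsistency can be demonstrated directly: duplicate keys in a dict display are legal (the last one wins), while a duplicate keyword argument is rejected at compile time:

```python
# Duplicate keys in a dict literal are allowed; the last wins.
assert {"a": 1, "a": 2} == {"a": 2}

# The same duplication in a call is a SyntaxError, caught by compile().
try:
    compile("f(a=1, a=2)", "<check>", "eval")
    duplicate_kwarg_rejected = False
except SyntaxError:
    duplicate_kwarg_rejected = True

assert duplicate_kwarg_rejected
```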
Re: [Python-Dev] subclassing builtin data structures
Isaac Schwabacher wrote: IIUC, the argument is that the Liskov Substitution Principle is a statement about how objects of a subtype behave relative to objects of a supertype, and it doesn't apply to constructors because they aren't behaviors of existing objects. Another way to say that is that constructors are class methods, not instance methods. -- Greg ___ Python-Dev mailing list [email protected] https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] PEP 441 - Improving Python ZIP Application Support
Paul Moore wrote: The alternative, I guess, is to have *no* default and write no shebang unless -p is specified. +1. That sounds like a very good idea to me. -- Greg ___ Python-Dev mailing list [email protected] https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Emit SyntaxWarning on unrecognized backslash escapes?
Chris Angelico wrote: Then he changed the code over to use his own file instead of the provided sample, and at the same time, switched from using open() to using csv.reader(open()), and moved all the code into a function, and fixed three other bugs, and now it isn't working. And he can't figure out why. There's probably a useful meta-lesson in there: test after making every change! -- Greg ___ Python-Dev mailing list [email protected] https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] Emit SyntaxWarning on unrecognized backslash escapes?
Thomas Wouters wrote: Trying to make the error messages more similar, or more identifying, may be a good idea (as long as they aren't misleading when people *meant* to use escape sequences in a string) It seems that Windows won't let you use control characters in filenames, so there is room for a more pointed error message from functions that take pathnames on Windows. -- Greg ___ Python-Dev mailing list [email protected] https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] super() does not work during class initialization
Martin Teichmann wrote: maybe we could just change the compiler to leave the order in which things are defined in a class in the class namespace, say as a member __order__? Then we could use plain-old dicts for the class namespace, and we would not slow down class creation (not that it matters much), as determining the order would happen at compile time. I don't think the compiler can determine the order in all cases. Consider:

    class Spam:
        if moon_is_full:
            alpha = 1
            beta = 2
        else:
            beta = 2
            alpha = 1

-- Greg ___ Python-Dev mailing list [email protected] https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
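A runtime (rather than compile-time) way to capture the order is a metaclass whose __prepare__ returns an ordered namespace, the PEP 3115 mechanism that already existed; a sketch (the metaclass and __order__ names are mine):

```python
from collections import OrderedDict

class OrderedMeta(type):
    @classmethod
    def __prepare__(mcls, name, bases):
        return OrderedDict()          # record assignments in order

    def __new__(mcls, name, bases, namespace):
        cls = super().__new__(mcls, name, bases, dict(namespace))
        # Keep only user-defined names, in definition order.
        cls.__order__ = [k for k in namespace if not k.startswith("__")]
        return cls

moon_is_full = False

class Spam(metaclass=OrderedMeta):
    if moon_is_full:
        alpha = 1
        beta = 2
    else:
        beta = 2
        alpha = 1

# Only the branch actually executed determines the order.
assert Spam.__order__ == ["beta", "alpha"]
```

This is exactly why the order has to be observed at class-creation time: the compiler cannot know which branch will run.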
Re: [Python-Dev] PEP 487 vs 422 (dynamic class decoration)
On 04/03/2015 02:31 PM, Nick Coghlan wrote: If I'm understanding PJE's main concern correctly it's that this approach requires explicitly testing that the decorator has been applied correctly in your automated tests every time you use it, as otherwise there's a risk of a silent failure when you use the decorator but omit the mandatory base class that makes the decorator work correctly. Could the decorator be designed to detect that situation somehow? E.g. the first time the decorated method is called, check that the required base class is present. -- Greg ___ Python-Dev mailing list [email protected] https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
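One possible shape for such a deferred check, raising on first use rather than failing silently (all names here are hypothetical, not PJE's actual decorator):

```python
class RequiredBase:
    """The base class the decorator depends on (hypothetical)."""

def needs_base(method):
    # Wrap a method so that, on first call, it verifies the class
    # was also given the mandatory base -- turning a silent failure
    # into a loud one.
    def wrapper(self, *args, **kwargs):
        if not isinstance(self, RequiredBase):
            raise TypeError(
                "%s uses @needs_base without inheriting RequiredBase"
                % type(self).__name__)
        return method(self, *args, **kwargs)
    return wrapper

class Good(RequiredBase):
    @needs_base
    def run(self):
        return "ok"

class Bad:
    @needs_base
    def run(self):
        return "ok"

assert Good().run() == "ok"
caught = False
try:
    Bad().run()
except TypeError:
    caught = True
assert caught
```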
Re: [Python-Dev] PEP 487 vs 422 (dynamic class decoration)
Eric Snow wrote: I've felt for a long time that it would be helpful in some situations to have a reverse descriptor protocol. Can you elaborate on what you mean by that? -- Greg ___ Python-Dev mailing list [email protected] https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] async/await in Python; v2
Yury Selivanov wrote: 1. CO_ASYNC flag was renamed to CO_COROUTINE; 2. sys.set_async_wrapper() was renamed to sys.set_coroutine_wrapper(); 3. New function: sys.get_coroutine_wrapper(); 4. types.async_def() renamed to types.coroutine(); I still don't like the idea of hijacking the generic term "coroutine" and using it to mean this particular type of object. 2. I propose to disallow using of 'for..in' loops, and builtins like 'list()', 'iter()', 'next()', 'tuple()' etc on coroutines. PEP 3152 takes care of this automatically from the fact that you can't make an ordinary call to a cofunction, and cocall combines a call and a yield-from. You have to go out of your way to get hold of the underlying iterator to use in a for-loop, etc. -- Greg ___ Python-Dev mailing list [email protected] https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] async/await in Python; v2
Yury Selivanov wrote: On the other hand, I hate the idea of grammatically requiring parentheses for 'await' expressions. That feels non-pythonic to me. How is it any different from grammatically requiring parens in an ordinary function call? Nobody ever complained about that. In the PEP 3152 way of thinking, a cocall is just a function call that happens to be suspendable. The fact that there is an iterator object involved behind the scenes is an implementation detail. You don't have to think about it or even know about it in order to write or understand suspendable code. It's possible to think about "yield from f(x)" or "await f(x)" that way, but only by exploiting a kind of pun in the code, where you think of f(x) as doing all the work and the rest as a syntactic marker indicating that the call is suspendable. PEP 3152 removes the pun by making this the *actual* interpretation of "cocall f(x)". -- Greg ___ Python-Dev mailing list [email protected] https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] async/await in Python; v2
Guido van Rossum wrote: On Wed, Apr 22, > OTOH I'm still struggling with what you have to do to wrap a coroutine in a Task, the way it's done in asyncio by the Task() constructor, the loop.create_task() method, and the async() function That's easy. You can always use costart() to adapt a cofunction for use with something expecting a generator-based coroutine, e.g.

    codef my_task_func(arg):
        ...

    my_task = Task(costart(my_task_func, arg))

If you're willing to make changes, Task() et al could be made to recognise cofunctions and apply costart() where needed. -- Greg ___ Python-Dev mailing list [email protected] https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] async/await in Python; v2
On 04/23/2015 04:18 AM, Yury Selivanov wrote: 2. We'll hack Gen(/ceval.c?) objects to raise an error if they are called directly and have a 'CO_COROUTINE' flag. By "Gen", do you mean the generator-function or the generator-iterator? That flag has to be on the generator-function, not the generator-iterator, otherwise by the time ceval sees it, the call that should have been forbidden has already been made. To make this work without flagging the function, it would be necessary to check the result of every function call that wasn't immediately awaited and raise an exception if it were awaitable. But that would mean awaitable objects not being fully first-class citizens, since there would be some perfectly reasonable things that you can't do with them. I suspect it would make the kernel of a coroutine-scheduling system such as asyncio very awkward, perhaps impossible, to write in pure Python. 3. Task(), create_task() and async() will be modified to call 'coro.__await__(..)' if 'coro' has a 'CO_COROUTINE' flag. Or, as I pointed out earlier, the caller can wrap the argument in something equivalent to costart(). 4. 'await' will require parentheses grammatically. That will make it different from 'yield' expression. For instance, I still don't know what would 'await coro(123)()' mean. In PEP 3152, cocall binds to the nearest set of function-calling parens, so 'cocall f()()' is parsed as '(cocall f())()'. If you want it the other way, you have to write it as 'cocall (f())()'. I know that's a somewhat arbitrary thing to remember, and it makes chained function calls a bit harder to write and read. But chaining calls like that is a fairly rare thing to do, in contrast with using a call expression as an argument to another call, which is very common. That's not the only case, either. Just about any unparenthesised use of yield-from other than the sole contents of the RHS of an assignment seems to be disallowed. 
All of these are currently syntax errors, for example:

    yield from f(x) + yield from g(x)
    x + yield from g(x)
    [yield from f(x)]

5. 'await foo(*a, **k)' will be an equivalent to 'yield from type(coro).__await__(coro, *a, **k)' Again, I'm not sure whether you're proposing to make the functions the await-able objects rather than the iterators (which would effectively be PEP 3152 with __cocall__ renamed to __await__) or something else. I won't comment further on this point until that's clearer. 6. If we ever decide to implement coroutine-generators -- async def functions with 'await' *and* some form of 'yield' -- we'll need to reverse the rule -- allow __call__ and disallow __await__ on such objects (so that you'll be able to write 'async for item in coro_gen()' instead of 'async for item in await coro_gen()'. Maybe. I haven't thought that idea through properly yet. Possibly the answer is that you define such a function using an ordinary "def", to match the way it's called. The fact that it's an async generator is then indicated by the fact that it contains "async yield". -- Greg ___ Python-Dev mailing list [email protected] https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
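The restriction on unparenthesised yield-from can be checked mechanically with compile(); a small sketch (the helper name is mine):

```python
def is_syntax_error(src):
    """Return True if src fails to compile."""
    try:
        compile(src, "<check>", "exec")
    except SyntaxError:
        return True
    return False

# Unparenthesised "yield from" is rejected as an operand ...
assert is_syntax_error("def g():\n    x + yield from f(x)\n")
assert is_syntax_error("def g():\n    yield from f(x) + yield from h(x)\n")
# ... but is accepted once parenthesised.
assert not is_syntax_error("def g():\n    x + (yield from f(x))\n")
```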
Re: [Python-Dev] async/await in Python; v2
Yury Selivanov wrote: I think there is another way... instead of pushing GET_ITER ... YIELD_FROM opcodes, we'll need to replace GET_ITER with another one: GET_ITER_SPECIAL ... YIELD_FROM I'm lost. What Python code are you suggesting this would be generated from? -- Greg ___ Python-Dev mailing list [email protected] https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] async/await in Python; v2
PJ Eby wrote: I find this a little weird. Why not just have `with` and `for` inside a coroutine dynamically check the iterator or context manager, and either behave sync or async accordingly? Why must there be a *syntactic* difference? It depends on whether you think it's important to have a syntactic marker for points where the code can potentially be suspended. In my original vision for PEP 3152, there was no "cocall" syntax -- you just wrote an ordinary call, and whether to make a cocall or not was determined at run time. But Guido and others felt that it would be better for suspension points to be explicit, so I ended up with cocall. The same reasoning presumably applies to asynchronous 'for' and 'with'. If you think that it's important to make suspendable calls explicit, you probably want to mark them as well. ...which, incidentally, highlights one of the things that's been bothering me about all this "async foo" stuff: "async def" looks like it *defines the function* asynchronously That bothers me a bit, too, but my main problem with it is the way it displaces the function name. "def f() async:" would solve both of those problems. -- Greg ___ Python-Dev mailing list [email protected] https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] async/await in Python; v2
Yury Selivanov wrote: - If it's an object with __await__, return iter(object.__await__()) Is the iter() really needed? Couldn't the contract of __await__ be that it always returns an iterator? -- Greg ___ Python-Dev mailing list [email protected] https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
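In the protocol as eventually specified, __await__ does indeed return an iterator; writing it as a generator method satisfies the contract with no extra iter() call. A sketch using Python 3.5+ syntax (the class and values are mine):

```python
class Answer:
    # __await__ is an ordinary method whose *return value* must be
    # an iterator; a generator method gives us one for free.
    def __await__(self):
        yield "suspended"        # hand control back to the driver once
        return 42                # becomes the value of "await Answer()"

async def use_it():
    return await Answer()

# Drive the coroutine by hand, as a scheduler would.
coro = use_it()
assert coro.send(None) == "suspended"   # the yield reaches the driver
result = None
try:
    coro.send(None)
except StopIteration as stop:
    result = stop.value
assert result == 42
```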
Re: [Python-Dev] async/await in Python; v2
Victor Stinner wrote: A huge part of the asyncio module is based on "yield from fut" where fut is a Future object. How do you write this using the PEP 3152? Do you need to call an artificial method like "cocall fut.return_self()" where the return_self() method simply returns fut? In a PEP 3152 world, Future objects and the like would be expected to implement __cocall__, just as in a PEP 492 world they would be expected to implement __await__. @asyncio.coroutine currently calls a function and *then* checks whether it should yield from it or not:

    res = func(*args, **kw)
    if isinstance(res, futures.Future) or inspect.isgenerator(res):
        res = yield from res

To accommodate the possibility of func being a cofunction, you would need to add something like

    if is_cofunction(func):
        res = yield from costart(func, *args, **kw)
    else:
        # as above

-- Greg ___ Python-Dev mailing list [email protected] https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] async/await in Python; v2
Victor Stinner wrote: Using a custom name like "cofunction" may confuse users coming from other programming languages. I prefer to keep "coroutine", but I agree that we should make some effort to define the different categories of "Python coroutines". I should perhaps point out that "cofunction" is not just an arbitrary word I made up to replace "coroutine". It is literally a kind of function, and is meant to be thought of that way. As for confusing new users, I would think that, as an unfamiliar word, it would point out that there is something they need to look up and learn about. Whereas they may think they already know what a "coroutine" is and not bother to look further. -- Greg ___ Python-Dev mailing list [email protected] https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] async/await in Python; v2
Ludovic Gasc wrote: Not related, but one of my coworkers asked me if with the new syntax it will be possible to write an async decorator for coroutines. This is certainly possible with PEP 3152. The decorator just needs to be an ordinary function whose return value is a cofunction. -- Greg ___ Python-Dev mailing list [email protected] https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
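With the PEP 492 syntax that was eventually adopted, the same pattern works as well: the decorator is an ordinary function that returns a new coroutine function. A sketch (the decorator name and tracing list are mine; asyncio.run needs Python 3.7+):

```python
import asyncio
import functools

calls = []

def traced(fn):
    # An ordinary decorator whose wrapper is itself a coroutine
    # function, so the decorated result stays awaitable.
    @functools.wraps(fn)
    async def wrapper(*args, **kwargs):
        calls.append(fn.__name__)
        return await fn(*args, **kwargs)
    return wrapper

@traced
async def compute(x):
    await asyncio.sleep(0)
    return x * 2

assert asyncio.run(compute(21)) == 42
assert calls == ["compute"]
```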
Re: [Python-Dev] async/await in Python; v2
Paul Sokolovsky wrote: And having both asymmetric and symmetric would quite confusing, especially that symmetric are more powerful and asymmetric can be easily implemented in terms of symmetric using continuation-passing style. You can also use a trampoline of some kind to relay values back and forth between generators, to get something symmetric. -- Greg ___ Python-Dev mailing list [email protected] https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
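A minimal such trampoline simply relays each yielded value into the other generator; a sketch (the driver and the two toy generators are mine):

```python
def run_symmetric(g1, g2):
    """Send whatever one generator yields into the other,
    until either side finishes."""
    next(g2)                      # prime g2 so it can receive
    sender, receiver = g1, g2
    value = None
    try:
        while True:
            value = sender.send(value)
            sender, receiver = receiver, sender
    except StopIteration:
        pass

log = []

def alice():
    x = yield 1                   # send 1 across, wait for a reply
    log.append(("alice got", x))
    x = yield x + 1
    log.append(("alice got", x))  # returning ends the exchange

def bob():
    x = None
    while True:
        x = yield (None if x is None else x * 10)
        log.append(("bob got", x))

run_symmetric(alice(), bob())
assert log == [("bob got", 1), ("alice got", 10),
               ("bob got", 11), ("alice got", 110)]
```

Each send() suspends one side and resumes the other, which is the symmetric control transfer being discussed, built on asymmetric yield.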
Re: [Python-Dev] async/await in Python; v2
Yury Selivanov wrote: So how would we do "await fut" if await requires parentheses? I've answered this with respect to PEP 3152 -- futures would implement __cocall__, so you would write 'cocall fut()'. I'm not sure what to say about PEP 492 here, because it depends on exactly what a version of await that "requires parentheses" would mean. It's not clear to me what you have in mind for that from what you've said so far. I'm not really in favour of just tweaking the existing PEP 492 notion of await so that it superficially resembles a PEP 3152 cocall. That misses the point, which is that a cocall is a special kind of function call, not a special kind of yield-from. -- Greg ___ Python-Dev mailing list [email protected] https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] async/await in Python; v2
Yury Selivanov wrote: I think that the problem of forgetting 'yield from' is a bit exaggerated. Yes, I myself forgot 'yield from' once or twice. But that's it, it has never happened since. I think it's more likely to happen when you start with an ordinary function, then discover that it needs to be suspendable, so you need to track down all the places that call it, and all the places that call those, etc. PEP 3152 ensures that you get clear diagnostics if you miss any. -- Greg ___ Python-Dev mailing list [email protected] https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
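The failure mode is easy to reproduce: a generator-based step that is called but not delegated to simply never runs (the names are illustrative):

```python
effects = []

def step():
    effects.append("ran")
    yield

def parent():
    step()        # BUG: forgot "yield from" -- this creates a
                  # generator and throws it away; step()'s body
                  # never executes, with no error raised
    yield

list(parent())    # drive parent to completion
assert effects == []   # the missing delegation failed silently
```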
Re: [Python-Dev] async/await in Python; v2
Yury Selivanov wrote: So you would have to write 'await fut()'? This is non-intuitive. That's because PEP 492 and its terminology encourage you to think of 'await f()' as a two-step process: evaluate f(), and then wait for the thing it returns to produce a result. PEP 3152 has a different philosophy. There, 'cocall f()' is a one-step process: call f and get back a result (while being prepared to get suspended in the meantime). The two-step approach has the advantage that you can get hold of the intermediate object and manipulate it. But I don't see much utility in being able to do that. Keep in mind that you can treat cofunctions themselves as objects to be manipulated, just like you can with ordinary functions, and all the usual techniques such as closures, * and ** parameters, etc. are available if you want to encapsulate one with some arguments. About the only thing you gain from being able to pass generator-iterators around instead of the functions that produce them is that you get to write t = Task(func(args)) instead of t = Task(func, args) which seems like a very minor thing to me. I would even argue that the latter is clearer, because it makes it very obvious that the body of func is *not* executed before the Task is constructed. The former makes it look as though the *result* of executing func with args is being passed to Task, rather than func itself. -- Greg ___ Python-Dev mailing list [email protected] https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] async/await in Python; v2
Andrew Svetlov wrote: From my understanding to use cofunctions I must wrap it with costart call: yield from gather(costart(coro1, a1, a2), costart(coro2), fut3) There are other places in asyncio API those accept coroutines or futures as parameters, not only Task() and async(). In a PEP 3152 aware version of asyncio, they would all know about cofunctions and what to do with them. -- Greg ___ Python-Dev mailing list [email protected] https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] async/await in Python; v2
Paul Sokolovsky wrote: Greg Ewing wrote: You can also use a trampoline of some kind to relay values back and forth between generators, to get something symmetric. Yes, that's of course how coroutine frameworks were done long before "yield from" appeared and how Trollius works now. No, what I mean is that if you want to send stuff back and forth between two particular coroutines in a symmetric way, you can write a specialised scheduler that just handles those coroutines. If you want to do that at the same time that other things are going on, I think you're better off not trying to do it using yield. Use a general scheduler such as asyncio, and some traditional IPC mechanism such as a queue for communication. -- Greg ___ Python-Dev mailing list [email protected] https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
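With asyncio, that queue-based pattern looks like the following (Python 3.5+ syntax; asyncio.run needs 3.7+; the sentinel convention is mine):

```python
import asyncio

async def producer(q):
    for i in range(3):
        await q.put(i)
    await q.put(None)                # sentinel: no more items

async def consumer(q, out):
    while True:
        item = await q.get()
        if item is None:
            break
        out.append(item * 2)

async def main():
    q = asyncio.Queue()
    out = []
    # The scheduler interleaves the two coroutines; the queue is
    # the only channel between them -- no direct yield plumbing.
    await asyncio.gather(producer(q), consumer(q, out))
    return out

assert asyncio.run(main()) == [0, 2, 4]
```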
Re: [Python-Dev] PEP 3152 and yield from Future()
Victor Stinner wrote: I'm still trying to understand how the PEP 3152 would impact asyncio. Guido suggests to replace "yield from fut" with "cocall fut()" (add parenthesis) and so add a __cocall__() method to asyncio.Future. Problem: PEP 3152 says "A cofunction (...) does not contain any yield or yield from expressions". A __cocall__ method doesn't have to be implemented with a cofunction. Any method that returns an iterator will do, including a generator. So a Future.__cocall__ that just invokes Future.__iter__ should work fine. How is it possible to suspend a cofunction if it's not possible to use yield? The currently published version of PEP 3152 is not really complete. A few things would need to be added to it, one of them being a suspend() builtin that has the same effect as yield in a generator-based coroutine. -- Greg ___ Python-Dev mailing list [email protected] https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] async/await in Python; v2
Andrew Svetlov wrote: But we already have asyncio and code based on asyncio coroutines. To make it work I should always use costart() in places where asyncio requires coroutine. As I understand it, asyncio would require changes to make it work seamlessly with PEP 492 as well, since an object needs to have either a special flag or an __await__ method before it can have 'await' applied to it. -- Greg ___ Python-Dev mailing list [email protected] https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] PEP 3152 and yield from Future()
[email protected] wrote: I can live with `cocall fut()` but the difference between `data = yield from loop.sock_recv(sock, 1024)` and `data = cocall (loop.sock_recv(sock, 1024))()` frustrates me very much. That's not the way it would be done. In a PEP-3152-ified version of asyncio, sock_recv would be a cofunction, so that would be just data = cocall loop.sock_recv(sock, 1024) -- Greg ___ Python-Dev mailing list [email protected] https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] PEP 3152 and yield from Future()
Victor Stinner wrote: You didn't answer to my question. My question is: is it possible to implement Future.__cocall__() since yield is defined in cofunctions. If it's possible, can you please show how? (Show me the code!) The implementation of a __cocall__ method is *not* a cofunction, it's an *ordinary* function that returns an iterator. In the case of Future, what it needs to do is identical to Future.__iter__. So the code can be just

    def __cocall__(self):
        return iter(self)

or equivalent. -- Greg ___ Python-Dev mailing list [email protected] https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] PEP 3152 and yield from Future()
Yury Selivanov wrote: Another problem is functions that return future:

    def do_something():
        ...
        return fut

With Greg's idea to call it you would do: cocall (do_something())() That means that you can't refactor your "do_something" function and make it a coroutine. There's no fundamental problem with a cofunction returning another cofunction:

    codef do_something():
        return fut

    f = cocall do_something()
    result = cocall f()

Combining those last two lines into one would require some extra parenthesisation, but I don't think that's something you're going to be doing much in practice. If you're just going to immediately call the result, there's no point in returning a future -- just do it all in do_something(). -- Greg ___ Python-Dev mailing list [email protected] https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] async/await in Python; v2
Barry Warsaw wrote: Sure, tools can be updated but it is it *necessary* to choose a syntax that breaks tools? def async useful(): seems okay to me. That will break any tool that assumes the word following 'def' is the name of the function being defined. Putting it at the end would seem least likely to cause breakage: def useful() async: -- Greg ___ Python-Dev mailing list [email protected] https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] async/await in Python; v2
Yury Selivanov wrote: If you have a future object 'fut', it's not intuitive or pythonic to write 'cocall fut()'. Another way to approach that would be to provide a cofunction await() used like this: cocall await(fut) That would read more naturally and wouldn't require modifying fut at all. -- Greg ___ Python-Dev mailing list [email protected] https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
Re: [Python-Dev] async/await in Python; v2
Yury Selivanov wrote: In a PEP 3152 aware version of asyncio, it's just *not possible to write*

    cocall gather(coro1(1,2), coro(2,3))

you just have to use your 'costart' built-in:

    cocall gather(costart(coro1, 1, 2), costart(coro, 2, 3))

Another way to write that would be

    cocall gather(Task(coro1, 1, 2), Task(coro, 2, 3))

I think that actually reads quite nicely, and makes it very clear that parallel tasks are being spawned, rather than invoked sequentially. With the current way, that's not clear at all. It's not quite as convenient, because you don't get currying for free the way you do with generators. But I feel that such implicit currying is detrimental to readability. It looks like you're passing the results returned by coro1 and coro2 to gather, rather than coro1 and coro2 themselves. Yes, it will require some code to be changed, but if you're turning all your coroutines into cofunctions or async defs, you're changing quite a lot of things already. PEP 3152 was created in pre-asyncio era, and it shows. I would say that asyncio was created in a pre-PEP-3152 world. Or at least it was developed without allowing for the possibility of adopting something like PEP 3152 in the future. Asyncio was based on generators and yield-from because it was the best thing we had at the time. I'll be disappointed if we've raced so far ahead with those ideas that it's now impossible to replace them with anything better. PEP 3152 is designed to present a *simpler* model of coroutine programming, by having only one concept, the suspendable function, instead of two -- generator functions on the one hand, and iterators/futures/awaitables/whatever you want to call them on the other. PEP 492 doesn't do that. It adds some things and changes some things, but it doesn't simplify anything. Your idea of syntactically forcing to use 'cocall' with parens is cute, You say that as though "forcing" the use of parens were a goal in itself. It's not -- it's a *consequence* of what a cocall is. 
-- Greg ___ Python-Dev mailing list [email protected] https://mail.python.org/mailman/listinfo/python-dev Unsubscribe: https://mail.python.org/mailman/options/python-dev/archive%40mail-archive.com
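[Editorial sketch.] For comparison, here is how the same implicit-vs-explicit distinction plays out in the asyncio that actually shipped. The coroutine names are stand-ins; passing bare coroutine objects lets gather wrap each one in a Task internally, while creating the tasks yourself makes the spawning explicit, much as the Task(...) spelling above would:

```python
import asyncio

async def coro1(a, b):
    return a + b

async def coro2(a, b):
    return a * b

async def main():
    # Implicit: gather wraps each coroutine object in a Task itself.
    implicit = await asyncio.gather(coro1(1, 2), coro2(2, 3))

    # Explicit: spell out that parallel tasks are being spawned.
    t1 = asyncio.create_task(coro1(1, 2))
    t2 = asyncio.create_task(coro2(2, 3))
    explicit = await asyncio.gather(t1, t2)

    return implicit, explicit

print(asyncio.run(main()))  # ([3, 6], [3, 6])
```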
Re: [Python-Dev] PEP 3152 and yield from Future()
Guido van Rossum wrote:
> I think this is the nail in PEP 3152's coffin.

Seems more like a small tack to me. :-) I've addressed all the issues raised there in earlier posts.

-- Greg
Re: [Python-Dev] PEP 3152 and yield from Future()
Victor Stinner wrote:
> Oh, I missed something in the PEP 3152: an obj.__cocall__() method can be an iterator/generator, it can be something different than a cofunction.

In fact, it *can't* be a cofunction. It's part of the machinery for implementing cofunctions.

> It's not easy to understand the whole puzzle. IMO the PEP 492 better explains how pieces are put together ;-)

Yes, it's written in a rather minimal style, sorry about that.

-- Greg
Re: [Python-Dev] PEP 3152 and yield from Future()
Yury Selivanov wrote:
> It's a common pattern in asyncio when functions return futures. It's OK later to refactor those functions to coroutines *and* vice-versa. This is a fundamental problem for the PEP 3152 approach.

Hmmm. So you have an ordinary function that returns a future, and you want to turn it into a coroutine function, but still have it return a future in order to keep the API the same, is that right?

Turning it into a coroutine means you're going to have to change every site that calls it, so its API has already changed. Given that, I'm not sure what advantage there is in keeping the future-returning part of the API.

However, if we use the await()-cofunction idea, then a call to the initial version looks like

    cocall await(f(x))

and after the refactoring it becomes

    cocall await(cocall f(x))

That doesn't look so bad to me.

-- Greg
Re: [Python-Dev] async/await in Python; v2
Stephen J. Turnbull wrote:
> Yury Selivanov writes:
> > I also read "for async item in iter:" as "I'm iterating iter with async item".
>
> I thought that was precisely the intended semantics: item is available asynchronously.

The async-at-the-end idea could be used here as well:

    for item in iter async:
        ...

    with something as x async:
        ...

-- Greg
Re: [Python-Dev] PEP 3152 and yield from Future()
Paul Moore wrote:
> On 24 April 2015 at 09:34, Greg Ewing wrote:
> >     cocall await(cocall f(x))
> > That doesn't look so bad to me.
>
> I've not been following this discussion (and coroutines make my head hurt) but this idiom looks like it's bound to result in people getting the idea that you scatter "cocall" throughout an expression until you get it to work.

They won't need to do that, because they'll get told exactly where they've left one out, or put one in that they shouldn't have. Also, the places you need to put cocall are exactly the same as the places you need yield-from currently, or await under PEP 492.

-- Greg
Re: [Python-Dev] PEP 492 vs. PEP 3152, new round
Guido van Rossum wrote:
> Yury, could you tweak the syntax for `await` so that we can write the most common usages without parentheses? In particular I'd like to be able to write
>
>     return await foo()
>     with await foo() as bar: ...
>     foo(await bar(), await bletch())

Making 'await' a prefix operator with the same precedence as unary minus would allow most reasonable usages, I think.

The only reason "yield from" has such a constrained syntax is that it starts with "yield", which is similarly constrained. Since 'await' is a brand new keyword, it isn't bound by those constraints.

-- Greg
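[Editorial sketch.] For the record, the Python that eventually shipped does allow all of these usages without parentheses. A runnable illustration with dummy coroutines (foo and bletch here are stand-ins, not real APIs):

```python
import asyncio

async def foo():
    return 2

async def bletch():
    return 3

def use(a, b):
    return a + b

async def main():
    if await foo():                       # condition position
        pass
    x = use(await foo(), await bletch())  # argument positions
    return await foo() + x                # return position

assert asyncio.run(main()) == 7  # 2 + (2 + 3)
```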
Re: [Python-Dev] async/await in Python; v2
Wild idea: Let "@" mean "async" when it's directly in front of a keyword. Then we would have:

    @def f():
        ...

    @for x in iter:
        ...

    @with context as thing:
        ...

-- Greg
Re: [Python-Dev] PEP 492 vs. PEP 3152, new round
Victor Stinner wrote:
> That's why I suggest to reconsider the idea of supporting an *optional* "from __future__ import async" to get async and await as keywords in the current file. This import would allow all crazy syntax. The parser might suggest to use the import when it fails to parse an async or await keyword :-)

To me, these new features *obviously* should require a __future__ import. Anything else would be crazy.

> I accept the compromise of creating a coroutine object without waiting for it (an obvious and common bug when learning asyncio). Hopefully, we keep the coroutine wrapper feature (ok, maybe I suggested this idea to Yury because I suffered so much when I learnt how to use asyncio ;-)), so it will still be easy to emit a warning in debug mode.

I'm disappointed that there will *still* be no direct and reliable way to detect and clearly report this kind of error, and that what there is will only be active in a special debug mode.

-- Greg
Re: [Python-Dev] PEP 492 vs. PEP 3152, new round
Guido van Rossum wrote:
> Sorry, when I wrote "future" (lower-case 'f') I really meant what Yury calls *awaitable*. That's either a coroutine or something with an __await__ method.

But how is an awaitable supposed to raise StopIteration if it's implemented by a generator or async def[*] function? Those things use StopIteration to wrap return values.

I like the idea of allowing StopIteration to be raised in an async def function and wrapping it somehow. I'd add that it could also be unwrapped automatically when it emerges from 'await', so that code manually invoking __anext__ can catch StopIteration as usual. I don't think this could conflict with any existing uses of StopIteration, since raising it inside generators is currently forbidden.

[*] I'm still struggling with what to call those things. Calling them just "coroutines" seems far too ambiguous. (There should be a Zen item along the lines of "If you can't think of a concise and unambiguous name for it, it's probably a bad idea".)

-- Greg
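[Editorial sketch.] What eventually shipped is the wrapping half of this idea: under PEP 479, which is always in effect for coroutines, a StopIteration escaping an async def body is converted to RuntimeError rather than being mistaken for a normal return. A minimal demonstration:

```python
import asyncio

async def bad():
    # StopIteration would be indistinguishable from a normal return
    # at the protocol level, so the runtime converts it.
    raise StopIteration

try:
    asyncio.run(bad())
except RuntimeError as exc:
    # The original StopIteration is chained as the cause.
    print(type(exc).__name__, "<-", type(exc.__cause__).__name__)
```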
Re: [Python-Dev] PEP 492 vs. PEP 3152, new round
Victor Stinner wrote:
> It's now time to focus our good energy on discussing remaining questions on the PEP 492 to make it the best PEP ever!

That's what I'm trying to do. I just think it would be even better if it could be made to address that issue somehow. I haven't thought of a way to do that yet, though.

-- Greg
Re: [Python-Dev] PEP 492 vs. PEP 3152, new round
Yury Selivanov wrote:
> I've done some experiments with grammar, and it looks like we indeed can parse await quite differently from yield. Three different options:

You don't seem to have tried what I suggested, which is to make 'await' a unary operator with the same precedence as '-', i.e. replace

    factor: ('+'|'-'|'~') factor | power

with

    factor: ('+'|'-'|'~'|'await') factor | power

That would allow

    await a()
    res = await a() + await b()
    res = await await a()
    if await a(): pass
    return await a()
    print(await a())
    func(arg=await a())
    await a() * b()
--
Greg
Re: [Python-Dev] PEP 492 vs. PEP 3152, new round
Yury Selivanov wrote:
> I don't want this: "await a() * b()" to be parsed, it's not meaningful.

Why not? If a() is a coroutine that returns a number, why shouldn't I be able to multiply it by something?

I don't think your currently proposed grammar prevents that anyway. We can have

    term --> term '*' factor
         --> factor '*' factor
         --> power '*' factor
         --> 'await' factor '*' factor
         --> 'await' 'a' '(' ')' '*' 'b' '(' ')'

It does, on the other hand, seem to prevent

    x = - await a()

which looks perfectly sensible to me.

I don't like the idea of introducing another level of precedence. Python already has too many of those to keep in my brain. Being able to tell people "it's just like unary minus" makes it easy to explain (and therefore possibly a good idea!).
--
Greg
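[Editorial sketch.] In the grammar Python 3.5 finally shipped, await got its own high-precedence slot, so both of the expressions debated above parse without parentheses. A runnable check with dummy coroutines:

```python
import asyncio

async def a():
    return 6

async def b():
    return 7

async def main():
    x = await a() * await b()  # parses as (await a()) * (await b())
    y = -await a()             # parses as -(await a())
    return x, y

assert asyncio.run(main()) == (42, -6)
```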
Re: [Python-Dev] PEP 492 vs. PEP 3152, new round
Yury Selivanov wrote:
> Looking at the grammar -- the only downside of the current approach is that you can't do 'await await fut'. I still think that it reads better with parens. If we put 'await' to 'factor' terminal we would allow
>
>     await -fut  # await (-fut)

Is there really a need to disallow that? It would take a fairly bizarre API to make it meaningful in the first place, but in any case, it's fairly clear what order of operations is intended without the parens.

-- Greg
Re: [Python-Dev] PEP 492: async/await in Python; v3
Yury Selivanov wrote:
> It's important to at least have 'iscoroutine' -- to check that the object is a coroutine function. A typical use-case would be a web framework that lets you bind coroutines to specific http methods/paths:
>
>     @http.get('/spam')
>     async def handle_spam(request):
>         ...
>
> The other thing is that it's easy to implement this function for CPython: just check for the CO_COROUTINE flag.

But isn't that too restrictive? Any function that returns an awaitable object would work in the above case.
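[Editorial sketch.] The point can be made concrete in the Python that shipped: an object with an __await__ method works fine with await, yet a CO_COROUTINE-flag check (what inspect.iscoroutine does) rejects it. The Ready class is a made-up minimal awaitable:

```python
import asyncio
import inspect

class Ready:
    # Not a coroutine -- no CO_COROUTINE flag anywhere. But it has
    # __await__, so 'await' accepts it.
    def __init__(self, value):
        self.value = value

    def __await__(self):
        if False:
            yield  # generator trick: makes __await__ return an iterator
        return self.value

async def main():
    return await Ready(42)

assert not inspect.iscoroutine(Ready(1))  # the flag-based check says no
assert inspect.isawaitable(Ready(1))      # ...but it is awaitable
assert asyncio.run(main()) == 42
```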
> > One of the most frequent mistakes that people make when using generators as coroutines is forgetting to use ``yield from``::
>
> I think it's a mistake that a lot of beginners may make at some point (and in this sense it's frequent). I really doubt that once you were hit by it more than two times you would make it again.

What about when you change an existing non-suspendable function to make it suspendable, and have to deal with the ripple effects of that? Seems to me that affects everyone, not just beginners.
> > 3. ``yield from`` does not accept coroutine objects from plain Python generators (*not* generator-based coroutines.)
>
> (What exactly are "coroutine objects from plain Python generators"?)
>
>     # *Not* decorated with @coroutine
>     def some_algorithm_impl():
>         yield 1
>         yield from native_coroutine()  # <- this is a bug

So what you really mean is "yield-from, when used inside a function that doesn't have @coroutine applied to it, will not accept a coroutine object", is that right? If so, I think this part needs re-wording, because it sounded like you meant something quite different.

I'm not sure I like this -- it seems weird that applying a decorator to a function should affect the semantics of something *inside* the function -- especially a piece of built-in syntax such as 'yield from'. It's similar to the idea of replacing 'async def' with a decorator, which you say you're against.
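[Editorial sketch.] The restriction as shipped can be demonstrated directly: a plain generator (no decorator) that tries to yield from a native coroutine object fails with a TypeError at the point of delegation. The function names are stand-ins:

```python
import warnings

async def native_coroutine():
    return 1

def some_algorithm_impl():
    yield 1
    yield from native_coroutine()  # rejected in a plain generator

with warnings.catch_warnings():
    # Silence the "never awaited" RuntimeWarning for the coroutine
    # object created above before the TypeError fires.
    warnings.simplefilter("ignore", RuntimeWarning)
    try:
        list(some_algorithm_impl())
    except TypeError as exc:
        print(exc)
```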
BTW, by "coroutine object", do you mean only objects
returned by an async def function, or any object having
an __await__ method? I think a lot of things would be
clearer if we could replace the term "coroutine object"
with "awaitable object" everywhere.
``yield from`` does not accept *native coroutine objects*
from regular Python generators
It's the "from" there that's confusing -- it sounds
like you're talking about where the argument to
yield-from comes from, rather than where the yield-from
expression resides. In other words, we though you were
proposing to disallow *this*:
# *Not* decorated with @coroutine
def some_algorithm_impl():
yield 1
yield from iterator_implemented_by_generator()
I hope to agree that this is a perfectly legitimate
thing to do, and should remain so?
--
Greg
Re: [Python-Dev] PEP 492: async/await in Python; v3
Guido van Rossum wrote:
> There's a cost to __future__ imports too. The current proposal is a pretty clever hack -- and we've done similar hacks in the past.

There's a benefit to having a __future__ import beyond avoiding hackery: by turning it on, you can find out what will break when async and await become real keywords. But I suppose that could be achieved by having both the hack *and* the __future__ import available.

-- Greg
Re: [Python-Dev] PEP 492 vs. PEP 3152, new round
Guido van Rossum wrote:
> I don't care for await await x.

But do you dislike it enough to go out of your way to disallow it?

-- Greg
Re: [Python-Dev] PEP 492: async/await in Python; v3
Yury Selivanov wrote:
> I also like Guido's suggestion to use the "native coroutine" term. I'll update the PEP (I have several branches of it in the repo that I need to merge before the rename).

I'd still prefer to avoid use of the word "coroutine" altogether, as it's far too overloaded. I think even the term "native coroutine" leaves room for ambiguity: it's not clear to me whether you intend it to refer only to functions declared with 'async def', or to any function that returns an awaitable object. The term "async function" seems like a clear and unambiguous way to refer to the former. I'm not sure what to call the latter.

-- Greg
Re: [Python-Dev] PEP 492: async/await in Python; v3
Guido van Rossum wrote:
> On Mon, Apr 27, 2015 at 8:07 PM, Yury Selivanov wrote:
> > Why StopAsyncIteration?
> > '''''''''''''''''''''''
>
> I keep wanting to propose to rename this to AsyncStopIteration.

+1, that seems more consistent to me too.

> > And since PEP 479 is accepted and enabled by default for coroutines, the following example will have its ``StopIteration`` wrapped into a ``RuntimeError``

I think that's a red herring in relation to the reason for StopAsyncIteration/AsyncStopIteration being needed. The real reason is that StopIteration is already being used to signal returning a value from an async function, so it can't also be used to signal the end of an async iteration.

> > One of the most frequent mistakes that people make when using generators as coroutines is forgetting to use ``yield from``::
> >
> >     @asyncio.coroutine
> >     def useful():
> >         asyncio.sleep(1)  # this will do nothing without 'yield from'
>
> Might be useful to point out that this was the one major advantage of PEP 3152 -- although it wasn't enough to save that PEP, and in your response you pointed out that this mistake is not all that common.

Although you seem to disagree with that here ("One of the most frequent mistakes ..."). I think we need some actual evidence before we can claim that one of these mistakes is more easily made than the other.

A priori, I would tend to assume that failing to use 'await' when it's needed would be the more insidious one. If you mistakenly treat the return value of a function as a future when it isn't one, you will probably find out about it pretty quickly even under the old regime, since most functions don't return iterators.

On the other hand, consider refactoring a function that was previously not a coroutine so that it now is. All existing calls to that function now need to be located and have either 'yield from' or 'await' put in front of them. There are three possibilities:

1. The return value is not used. The destruction-before-iterated-over heuristic will catch this (although since it happens in a destructor, you won't get an exception that propagates in the usual way).

2. Some operation is immediately performed on the return value. Most likely this will fail, so you will find out about the problem promptly and get a stack trace, although the error message will be somewhat tangentially related to the cause.

3. The return value is stored away for later use. Some time later, an operation on it will fail, but it will no longer be obvious where the mistake was made.

So it's all a bit of a mess, IMO. But maybe it's good enough. We need data: how often have people been bitten by this kind of problem, and how much trouble did it cause them?

> Does send() make sense for a native coroutine? Check PEP 380. I think the only way to access the send() argument is by using ``yield`` but that's disallowed. Or is this about send() being passed to the ``yield`` that ultimately suspends the chain of coroutines?

That's made me think of something else. Suppose you want to suspend execution in an 'async def' function -- how do you do that if 'yield' is not allowed? You may need something like the suspend() primitive that I was thinking of adding to PEP 3152.

> > No implicit wrapping in Futures
> > ---
> > There is a proposal to add a similar mechanism to ECMAScript 7 [2]_. A key difference is that JavaScript "async functions" always return a Promise. While this approach has some advantages, it also implies that a new Promise object is created on each "async function" invocation.

I don't see how this is different from an 'async def' function always returning an awaitable object, or a new awaitable object being created on each 'async def' function invocation. Sounds pretty much isomorphic to me.

-- Greg
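[Editorial sketch.] Case 1 above is the heuristic CPython ended up shipping: a coroutine object destroyed without ever being awaited produces a RuntimeWarning from its destructor rather than an ordinary exception. A minimal demonstration (this relies on CPython's immediate refcount-based finalization at `del`):

```python
import asyncio
import warnings

async def useful():
    await asyncio.sleep(1)

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    coro = useful()  # forgot 'await' -- nothing runs
    del coro         # destructor fires the never-awaited heuristic

assert any("never awaited" in str(w.message) for w in caught)
```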
Re: [Python-Dev] PEP 492: async/await in Python; v3
Yury Selivanov wrote:
> On 2015-04-28 11:59 PM, Greg wrote:
> > On 29/04/2015 9:49 a.m., Guido van Rossum wrote:
> > > *But* every generator-based coroutine *must* be decorated with `asyncio.coroutine()`. This is potentially a backwards incompatible change. See below. I worry about backward compatibility. A lot.
> >
> > Are you saying that asyncio-based code that doesn't use @coroutine will break in 3.5? That seems unavoidable if the goal is for 'await' to only work on generators that are intended to implement coroutines,
>
> Not sure what you mean by "unavoidable".

Guido is worried about existing asyncio-based code that doesn't always decorate its generators with @coroutine. If I understand correctly, if you have

    @coroutine
    def coro1():
        yield from coro2()

    def coro2():
        yield from ...

then coro1() would no longer work. In other words, some currently legitimate asyncio-based code will break under PEP 492 even if it doesn't use any PEP 492 features.

What you seem to be trying to do here is catch the mistake of using a non-coroutine iterator as if it were a coroutine. By "unavoidable" I mean I can't see a way to achieve that in all possible permutations without giving up some backward compatibility.

-- Greg
