On Sat, 1 Oct 2016 10:46 am, Gregory Ewing wrote:

> Steve D'Aprano wrote:
>> Giving for-loops their own namespace is a grossly unintuitive and a very
>> weird thing to do.
>>
>> It would be terribly inconvenient and surprising for if...else blocks to
>> be separate namespaces:
>
> There's an important difference between a for-loop and an
> if-statement that's relevant here: a for-loop binds a name,
> whereas an if-statement doesn't.
That's incorrect: either can bind any number of names:

if condition:
    a = 1
    b = 2
else:
    a = 222
    b = 111

for i in seq:
    x = i + 1
    y = x**2
    z = 3*y


> Whenever there's binding going on, it's necessary to decide
> whether it should be creating a new binding or updating an
> existing one.

Right.

> This is actually a *different* issue from one of scope.
> List comprehensions were changed so that the loop variable
> lives in a different scope from the containing function.
> However, they still have the same unintuitive behaviour
> with regard to capture of the loop variable by a lambda.
>
> >>> l = [lambda: i for i in range(3)]
> >>> for f in l: print(f())
> ...
> 2
> 2
> 2
>
> Most people consider *this* behaviour to be far weirder
> than anything that would result from giving for-loop
> variables their own scope.
>
> Even if you don't think it's weird, it's hard to argue
> that it's *useful* in any significant number of cases.

Firstly, let's agree (or perhaps we don't?) that loop variables are the
same kind of variable as any other. It would be strange and confusing to
have different kinds of variables, with different binding and lookup
rules, depending on where they came from.

The whole list comprehension and lambda syntax is a red herring. Let's
write it like this:

alist = []
for i in (0, 1, 2):
    def func():
        return i
    alist.append(func)

for f in alist:
    print(f())

And the output is the same: 2 2 2.

Okay, that's inconvenient and not what I wanted. Obviously. But as they
say about computers, "this damn machine won't do what I want, only what I
tell it to do". It did *exactly what I told it to do*, and, yes, that is a
good thing.

Let's remember that the variable i is no more special than any other
variable:

alist = []
x = 0
for i in (0, 1, 2):
    def func():
        return x
    alist.append(func)

x = 999  # What do you predict the functions will return?

for f in alist:
    print(f())

Are you surprised that each func() returns the same value, the current
value of x, instead of whatever accidental value x happened to have when
the function was defined? I should hope not. I would expect most people to
agree that a variable lookup should return the value of the variable *as
it is now*, not as it was when the function was defined.

Let's call that Rule 1, and I say it is fundamental to being able to
reason about code. If I say:

x = 1
spam(x)

then spam() MUST see the current value of x, namely 1, not some mysterious
value of x that happened to exist long ago in the mists of time when
spam() was first defined. And Rule 1 needs to apply to ALL variable
look-ups, not just arguments to functions.

Closures, of course, are a special case of Rule 1, not an exception:

def make(arg):
    x = arg
    def func():
        return x
    return func

f = make(0)
print(f())

*Without* closures, that would lead to a NameError (unless there happened
to be a global called "x"): the local variables of make() no longer exist,
so you cannot refer to them. That was the behaviour in Python 1.5, for
example. But *with* closures, the local variables still exist: the inner
function grabs the surrounding environment (or at least as much of it as
it needs) and keeps it alive, so that it can look up names in that
surrounding scope.

What do you expect should happen here?

def make(arg):
    x = arg
    def func():
        return x
    x = 999
    return func

f = make(0)
print(f())

By Rule 1, the only sensible behaviour is for f() to return 999,
regardless of whether that is convenient or not, regardless of whether
that is what I intended or not.
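(If it helps to see the machinery, here is a minimal sketch -- the name
make_and_rebind is mine, not anything quoted above -- that pokes at the
closure cell attached to the inner function. It shows that func carries a
reference to the cell holding x, and reads whatever that cell contains at
call time, not a snapshot taken at definition time:

def make_and_rebind(arg):
    x = arg
    def func():
        return x
    x = 999          # rebind x *after* func has been defined
    return func

f = make_and_rebind(0)

# func.__closure__ is a tuple of cell objects, one per free variable.
cell = f.__closure__[0]
print(cell.cell_contents)   # 999 -- the cell holds the *current* x
print(f())                  # 999 -- Rule 1 in action

That's Rule 1 again, just made visible.)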
The interpreter shouldn't try to guess what I mean, it shouldn't cache
variable look-ups, and it shouldn't give x special behaviour, different
from all other variables, just so that this special case is more
convenient.

But wait, I hear you say, what about closures? Don't they cache the values
of the surrounding variables? Perhaps in a sense, but only a weak sense.
Each call to make() creates a *new* local environment (it's a function;
each time you call a function it starts completely afresh), so each of the
inner functions returned is independent of the others, with its own
independent closure. And each inner function still looks up the current
value of *its own* closed-over x, not a snapshot of x as it was when the
inner function was defined.

Final step: let's unroll that original loop.

l = [lambda: i for i in range(3)]
for f in l:
    print(f())

becomes:

l = []
i = 0
l.append(lambda: i)
i = 1
l.append(lambda: i)
i = 2
l.append(lambda: i)

for f in l:
    print(f())

Are you still surprised that it prints 2 2 2? Is this still "unintuitive"?

I say it isn't, and at the risk of coming across as smug and obnoxious I'm
going to say that anyone who *claims* to still be surprised that the three
lambda functions all print the same value for i, namely 2, is almost
certainly playing ignorant because they're unwilling to admit that the
actual behaviour of Python here is exactly what we should both expect and
desire, if only we think about it rather than hoping for some magical Do
What I Mean semantics.

Of course all three functions print the same value -- they're all looking
up the same variable, which can only have one value at a time. And the
*last* thing we would want would be for functions to magically take a
snapshot of variables' values at the time of function creation. (And no,
closures do not violate that rule -- they're a special case, not an
exception.)


>> To me, "make for-loops be their own scope" sounds like a joke feature
>> out of joke languages like INTERCAL.
>
> Which is a straw man, since that's not actually what we're
> talking about doing.

Jeez, how is it a strawman when the OP *specifically and explicitly*
refers to making for-loops their own scope??? Quote:

    all along i assumed that for-loops got a namespace of their own
    ...
    has there ever been any discussion of giving for-loops the option
    of running in namespaces of their own?

Donald Trump says that when elected president, he'll build a wall to keep
Mexicans out of the US. When people call him out on this and explain how
stupid that idea is, do you call that a strawman too?

A strawman is a weak argument that your opponent DID NOT MAKE, not just
any weak argument. It is not a fallacy to directly challenge weak
arguments. But a false accusation of strawman is itself a fallacy:
poisoning the well.


> It's neither necessary nor sufficient
> to solve the problem.

Right. But that's not *my* problem.


> What *is* necessary and sufficient is to make each iteration
> of the for-loop create a new binding of the loop variable
> (and not any other variable!).

No, that's not right -- they're already new bindings. Variable bindings in
Python are not in-place modifications of a variable as in C, where the
variable is a bucket whose contents you can modify in place. They are
values bound to a key in a namespace.

If you don't believe me, try this:

alist = []
for i in (0, 1, 2):
    def func():
        return i
    alist.append(func)
    if i == 2:
        continue
    del i

Does it make *any* difference at all to the behaviour? Apart from being a
tiny bit slower, no, of course it does not.
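(A tangent, since the question of what is *useful* keeps coming up: if
what you actually want is for each function to remember the value i had
when the function was created, Python already lets you say so explicitly,
by binding the value as a default argument. A minimal sketch of that
idiom -- not something anyone in this thread wrote:

alist = []
for i in (0, 1, 2):
    def func(i=i):   # the default is evaluated *now*, at definition time
        return i
    alist.append(func)

for f in alist:
    print(f())       # 0, 1, 2 -- each function keeps its own i

Note that this doesn't violate Rule 1: each call still looks up the
current value of a variable; it's just that the variable is now a
parameter local to each function, bound once when the def statement ran.)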
>> I'm not aware of any sensible language that
>> does anything like this.
>
> Scheme and Ruby come to mind as examples of languages in
> which the equivalent of a for-loop results in each iteration
> getting a new binding of the control variable. Although
> you could argue that these languages are not "sensible". :-)

:-)


--
Steve
“Cheer up,” they said, “things could be worse.” So I cheered up, and sure
enough, things got worse.
--
https://mail.python.org/mailman/listinfo/python-list