Thanks. That cdef-locals concept is consistent with the following example:
def f():
    i = 1
    def g(): print('i' in globals(), 'i' in locals())
    def h(): print('i' in globals(), 'i' in locals()); i
    g()
    h()
f()
False False
False True
It is a mystery, which may be why the documentation just warns against trying to modify a function's locals this way.
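For what it's worth, the compile-time decision is visible directly on the code objects. Here is a small illustration of my own (nothing beyond the standard __code__ attributes) showing why 'i' ends up in h's locals() but not in g's:

def f():
    i = 1
    def g(): pass            # never mentions i
    def h(): return i        # mentions i, so the compiler makes it a free variable of h
    return g, h

g, h = f()
print(g.__code__.co_freevars)   # ()
print(h.__code__.co_freevars)   # ('i',)

Since locals() called inside a function includes its free variables, the closure reference is what makes 'i' appear for h.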
I did a few tests
# test 1
def f():
    i = 1
    print(locals())
    exec('y = i; print(y); print(locals())')
    print(locals())
    a = eval('y')
    print(locals())
    u = a
    print(u)
f()
{'i': 1}
1
{'i': 1, 'y': 1}
{'i': 1, 'y': 1}
{'i': 1, 'y': 1, 'a': 1}
1
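Along the same lines, just my own illustration (on the CPython versions current in this thread): the 'y' that exec creates lives only in the snapshot dict that locals() and eval() see; it never becomes a real local, so a plain reference to it still fails:

def f():
    i = 1
    exec('y = i')
    print(locals())          # {'i': 1, 'y': 1} -- 'y' sits in the snapshot dict
    print(eval('y'))         # 1 -- eval reads the same dict
    try:
        print(y)             # the compiler never made 'y' a local of f
    except NameError as e:
        print('NameError:', e)
f()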
# test 2
def f():
    i = 1
On 7/20/22, george trojan wrote:
>
> 1. This works as I expect it to work:
>
> def f():
>     i = 1
>     print(locals())
>     exec('y = i; print(y); print(locals())')
>     print(locals())
>     exec('y *= 2')
>     print('ok:', eval('y'))
> f()
In CPython, the locals of a function scope (as opposed to module or class scope) are laid out by the compiler in a fixed array of "fast locals"; locals() only returns a snapshot dict of that array, and exec/eval run without explicit namespaces read from and write to that snapshot, not the real locals.
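To make that concrete with a toy example of my own (the dict name ns is just mine): at module scope exec writes into the real namespace, while inside a function the usual way to get predictable behaviour is to hand exec/eval an explicit namespace instead of relying on locals():

# module scope: globals() is the real namespace, so the exec-created name is usable
i = 1
exec('y = i + 1')
print(y)                     # 2 -- 'y' is a genuine module global

# function scope: pass exec/eval their own dict instead of relying on locals()
def f():
    i = 1
    ns = {'i': i}            # explicit namespace
    exec('y = i * 2', ns)    # the assignment lands in ns, not in f's locals
    print(ns['y'])           # 2
    print(eval('y + 1', ns)) # 3
f()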
I wish I could understand the following behaviour:
1. This works as I expect it to work:
def f():
    i = 1
    print(locals())
    exec('y = i; print(y); print(locals())')
    print(locals())
    exec('y *= 2')
    print('ok:', eval('y'))
f()
{'i': 1}
1
{'i': 1, 'y': 1}
{'i': 1, 'y': 1}
ok: 2