On Jun 27, 2019, at 08:59, nate lust <[email protected]> wrote:
>
> There are two (three) important cases where I have exempted __getself__ from
> being called. The first is when an object is used in methods defined within
> itself. This means that self can be used when defining methods without
> triggering recursive behavior.
What happens in cases like the way operators are defined in Fraction (which the
docs suggest is the best way to implement operators for your own custom numeric
types):
    def _operator_fallbacks(monomorphic_operator, fallback_operator):
        def forward(a, b):
            # complicated stuff
            ...
        def reverse(b, a):
            # more ...
            ...
        return forward, reverse

    def _add(a, b):
        da, db = a.denominator, b.denominator
        return Fraction(a.numerator * db + b.numerator * da, da * db)

    __add__, __radd__ = _operator_fallbacks(_add, operator.add)

    # similar fallback calls for all the other ops
The forward and reverse functions are not defined at class scope, but as locals
within a class-scope function (one which isn’t called as a method), and they
don’t call their arguments self, but they do get bound to the class names
__add__ and __radd__ and eventually called as methods. So, does some part of
that trigger the magic so that the a argument in forward and the a (or is it
b?) argument in reverse get exempted, or not? If so, does it also trigger for
the other argument iff a is b?
I’m not even sure which behavior I’d want, much less which I think your code
will do based on your description.
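For concreteness, here's a stripped-down toy of the pattern I'm asking about (my made-up class and names, not Fraction's actual code):

    class Thing:
        def _fallbacks():
            # helper runs at class-definition time, not as a method
            def forward(a, b):
                return ("forward", a, b)
            def reverse(b, a):
                return ("reverse", a, b)
            return forward, reverse

        __add__, __radd__ = _fallbacks()

    t = Thing()
    t + 1    # Thing.__add__(t, 1), i.e. forward(a=t, b=1)
    1 + t    # int.__add__ fails, so Thing.__radd__(t, 1), i.e. reverse(b=t, a=1)

Neither forward's a nor reverse's b is spelled "self", and neither def is lexically inside the class body, yet both end up receiving the instance when called as methods.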
Presumably this would come up in many real-world expression-template libraries.
At least it does in those that already exist, like SymPy, where this:
    x = sympy.Symbol('x')
    y = 2 + x
… calls x.__radd__(2) which returns a sympy.Add object using roughly similar
code. (There’s no way to capture a “plain” value like 2 as part of an
expression otherwise.)
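To make that concrete, here's a toy expression-template sketch in the spirit of what SymPy does (my toy, not SymPy's actual code):

    class Expr:
        def __add__(self, other):
            return Add(self, other)
        __radd__ = __add__          # 2 + x falls back to x.__radd__(2)

    class Symbol(Expr):
        def __init__(self, name):
            self.name = name

    class Add(Expr):
        def __init__(self, *args):
            self.args = args        # captures the plain 2 as part of the tree

    x = Symbol('x')
    y = 2 + x       # int.__add__ returns NotImplemented, so Python calls
                    # x.__radd__(2), which builds Add(x, 2)

If some spellings of that addition trigger __getself__ and others don't, it's not obvious what x is by the time Add sees it.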
> The other cases are calling or returning from a function.
If calling a function is magic, but using an operator isn’t, doesn’t that mean
that operator.add(a, b) is no longer equivalent to a+b (the whole reason the
operator module exists), and np.matmul(x, y) and x @ y, and so on?
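A minimal sketch of the asymmetry I mean (MetaVar and __getself__ are the proposal's hypothetical names; nothing special fires under today's Python):

    import operator

    class MetaVar:
        def __getself__(self):       # not a real hook today
            return self
        def __add__(self, other):
            return "added"
        __radd__ = __add__

    x, y = MetaVar(), MetaVar()

    x + y               # plain operator: no special treatment?
    operator.add(x, y)  # function call: gets the proposal's special treatment?

Today those two lines are interchangeable by definition (that's the whole point of the operator module); under the proposal they apparently diverge.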
> This is to ensure the following stay consistent.
>
> def f1():
>     x = MetaVar()
>     return x
>
> def f2():
>     return MetaVar()
But if it’s, say, a yield expression instead of a return statement, they’re not
consistent anymore, right? So this fixes one common refactoring step, but only
by making it work differently from other very similar refactoring steps.
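Here's the kind of refactoring I mean, using the proposal's MetaVar as a stand-in (a sketch; __getself__ does nothing under today's Python):

    class MetaVar:
        def __getself__(self):      # hypothetical hook from the proposal
            return self

    def f1():
        x = MetaVar()
        return x            # special-cased: hands back the MetaVar itself

    def f2():
        return MetaVar()    # defined to stay consistent with f1

    def f3():
        x = MetaVar()
        yield x             # a yield expression, not a return statement,
                            # so the special case doesn't apply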
> In f1 the return checks whether its argument is the result of a MetaVar
> __getself__ call and, if so, returns the MetaVar instead.
What if your __getself__ is there for side effects rather than for replacing
the value, as in your counter and logger examples? In that case, throwing away
the return value (which is just x anyway) doesn’t make the refactoring
idempotent; the extra side effects still happen.
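For example, with a counting MetaVar of the sort I assume your counter example looks like (my toy, not your code):

    class CountingVar:
        accesses = 0
        def __getself__(self):              # hypothetical hook
            type(self).accesses += 1        # the side effect in question
            return self

    def f1():
        x = CountingVar()
        return x        # the load of x already bumped the counter, even if
                        # return then discards __getself__'s result

    def f2():
        return CountingVar()    # no name load, so no extra count

So f1 and f2 still differ observably, just in their side effects rather than their return values.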
Also, what about these cases:
    return x or None         # x is truthy
    return y or x            # y is falsey
    return x if spam else y  # spam is truthy
    return fixup(x)
    return x + y             # y is 0 or "" or () or similar
I’m not sure whether I expect, or want, the magic to trigger, throwing away the
result and returning x instead.
Or, if you meant that it checks the bytecode statically to see if it’s just
returning the result of a load without doing anything in between, what if, say,
x.__getself__() raises?
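For reference, here's roughly what that bytecode looks like today (CPython 3.7-ish; exact opcodes vary by version):

    import dis

    def f1():
        x = object()
        return x

    dis.dis(f1)
    #   LOAD_GLOBAL    object
    #   CALL_FUNCTION  0
    #   STORE_FAST     x
    #   LOAD_FAST      x
    #   RETURN_VALUE

A static check would presumably key off the LOAD_FAST/RETURN_VALUE pair, but it can't know whether x.__getself__() is going to raise until the call actually happens.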