Jeroen Demeyer added the comment:
> Is there any benchmark showing if it's faster
Here is one example:
class D(dict):
    def __missing__(self, key):
        return None

d = D()

and now benchmark d[0]:
**before**: Mean +- std dev: 173 ns +- 1 ns
**after**: Mean +- std dev: 162 ns
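The effect can be reproduced with the stdlib timeit module instead of pyperf; a minimal sketch (absolute numbers depend on the build and machine, so no expected output is claimed):

```python
import timeit

class D(dict):
    def __missing__(self, key):
        return None

d = D()

# __missing__ runs on every lookup of an absent key, so d[0]
# exercises exactly the call path being benchmarked.
per_call = timeit.timeit("d[0]", globals={"d": d}, number=100_000) / 100_000
print(f"{per_call * 1e9:.0f} ns per d[0] lookup")
```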
Jeroen Demeyer added the comment:
Stefan: I used an underscore by analogy with
PyObject_CallNoArgs()/_PyObject_CallNoArg(), where the first is in the limited
API and the second is an inline function in the CPython API.
But maybe we could revisit that decision.
Jeroen Demeyer added the comment:
Victor, what's your opinion on adding PyObject_CallOneArg() to the limited API?
--
___
Python tracker
<https://bugs.python.org/is
New submission from Jeroen Demeyer :
Try to use _PyObject_CallNoArg in all places where a function is called without
arguments.
--
components: Interpreter Core
messages: 347230
nosy: jdemeyer
priority: normal
severity: normal
status: open
title: Use _PyObject_CallNoArg() in a few more places
Change by Jeroen Demeyer :
--
keywords: +patch
pull_requests: +14393
stage: -> patch review
pull_request: https://github.com/python/cpython/pull/14575
Jeroen Demeyer added the comment:
Test of stack usage:

from _testcapi import stack_pointer

class D(dict):
    def __missing__(self, key):
        sp = stack_pointer()
        print(f"stack usage = {TOP - sp}")
        return None

d = D()
TOP = stack_pointer()
d[0]
**before**: s
Change by Jeroen Demeyer :
--
pull_requests: +14394
pull_request: https://github.com/python/cpython/pull/14575
Jeroen Demeyer added the comment:
For the benefit of PR 37207, I would like to re-open this discussion. It may
have been rejected for the wrong reasons. Victor's patch was quite inefficient,
but that's to be expected: msg285744 mentions a two-step process, but during
the disc
Change by Jeroen Demeyer :
--
resolution: -> fixed
stage: patch review -> resolved
status: open -> closed
Change by Jeroen Demeyer :
--
pull_requests: +14405
pull_request: https://github.com/python/cpython/pull/11636
Jeroen Demeyer added the comment:
> How can we avoid unpacking dict in case of d1.update(**d2)?
We cannot. However, how common is that call? One could argue that we should
optimize for the more common case of d1.update(d2).
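The two call forms can be compared directly from Python; a small sketch (the exact TypeError message varies by version):

```python
d2 = {"a": 1, "b": 2}

d1 = {}
d1.update(d2)    # merges the mapping directly; any hashable keys work
assert d1 == {"a": 1, "b": 2}

d1 = {}
d1.update(**d2)  # unpacks d2 into keyword arguments first
assert d1 == {"a": 1, "b": 2}

# The unpacked form only works with str keys; non-str keys are
# rejected at the call site:
try:
    d1.update(**{1: "x"})
except TypeError:
    print("keywords must be strings")
```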
--
Jeroen Demeyer added the comment:
Above, I meant #37207 or PR 13930.
--
Jeroen Demeyer added the comment:
> How can we avoid unpacking dict in case of d1.update(**d2)?
The unpacking is only a problem if you insist on using PyDict_Merge(). It would
be perfectly possible to implement dict merging from a tuple+vector instead of
from a dict. In that case, th
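A Python-level sketch of what merging from a "tuple+vector" could mean: a kwnames tuple plus a parallel values sequence, as in vectorcall, updating the target dict in place with no intermediate dict. The function name is hypothetical:

```python
def merge_from_vector(d, kwnames, args):
    """Merge keyword data given as (kwnames, args) directly into d.

    kwnames: tuple of str keys; args: sequence of values, same length.
    No intermediate dict is built.
    """
    for name, value in zip(kwnames, args):
        d[name] = value
    return d
```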
Jeroen Demeyer added the comment:
You are correct that PyDict_Merge() does not need to recompute the hashes of
the keys. However, your example doesn't work because you need string keys for
**kwargs. The "str" class caches its hash, so you would need a dict with a
"str&
Change by Jeroen Demeyer :
--
pull_requests: +14406
pull_request: https://github.com/python/cpython/pull/14588
Change by Jeroen Demeyer :
--
pull_requests: +14407
stage: -> patch review
pull_request: https://github.com/python/cpython/pull/14589
Change by Jeroen Demeyer :
--
pull_requests: +14415
pull_request: https://github.com/python/cpython/pull/14600
Jeroen Demeyer added the comment:
> Jeroen: hum, you both proposed a similar fix :-)
It seems that I lost the race ;-)
But seriously: if we both independently came up with the same solution, that's
a good sign that the solution is
Jeroen Demeyer added the comment:
One thing that keeps bothering me when using vectorcall for type.__call__ is
that we would have two completely independent code paths for constructing an
object: the new one using vectorcall and the old one using tp_call, which in
turn calls tp_new and
Jeroen Demeyer added the comment:
Any objections to closing this?
--
Change by Jeroen Demeyer :
--
pull_requests: +14418
pull_request: https://github.com/python/cpython/pull/14603
Jeroen Demeyer added the comment:
> d2 must be converted to 2 lists (kwnames and args) and then a new dict should
> be created.
The last part is not necessarily true. You could do the update directly,
without having that intermediat
Jeroen Demeyer added the comment:
> but it will make d1.update(**d2) slower with a complexity of O(n): d2 must be
> converted to 2 lists
This part is still true and it causes a slow-down of about 23% for
dict.update(**d), see benchmarks at
https://github.com/python/cpython/pull
New submission from Jeroen Demeyer :
Keyword names in calls are expected to be strings, however it's currently not
clear who should enforce/check this.
I suggest fixing this for vectorcall/METH_FASTCALL and specifying that it's the
caller's job to make sure that keyword names a
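At the Python level this check is already enforced at the call site: unpacking a mapping with non-string keys fails before the callee ever runs. A small illustration:

```python
def f(**kwargs):
    return kwargs

assert f(x=1) == {"x": 1}

# A non-str key (here: bytes) is rejected when the mapping is
# unpacked into keyword arguments, before f() is entered:
try:
    f(**{b"name": 1})
except TypeError:
    print("keyword names must be str")
```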
Change by Jeroen Demeyer :
--
keywords: +patch
pull_requests: +14487
stage: -> patch review
pull_request: https://github.com/python/cpython/pull/14682
Change by Jeroen Demeyer :
--
resolution: -> fixed
stage: patch review -> resolved
status: open -> closed
Change by Jeroen Demeyer :
--
resolution: -> fixed
stage: patch review -> resolved
status: open -> closed
Change by Jeroen Demeyer :
--
pull_requests: +14488
pull_request: https://github.com/python/cpython/pull/14683
Change by Jeroen Demeyer :
--
pull_requests: +14490
pull_request: https://github.com/python/cpython/pull/14684
New submission from Jeroen Demeyer :
We already have
_PyObject_CallNoArg()
_PyObject_CallOneArg()
_PyObject_CallMethodNoArgs()
so it makes sense to also add
_PyObject_CallMethodOneArg()
--
components: Interpreter Core
messages: 347619
nosy: inada.naoki, jdemeyer, vstinner
priority
Change by Jeroen Demeyer :
--
keywords: +patch
pull_requests: +14492
stage: -> patch review
pull_request: https://github.com/python/cpython/pull/14685
Change by Jeroen Demeyer :
--
resolution: -> fixed
stage: patch review -> resolved
status: open -> closed
Jeroen Demeyer added the comment:
I understand the arguments for not removing these functions. However, I still
think that we should deprecate them but without planning in advance when they
should be removed. Victor said that we should document these functions as
"please don'
Jeroen Demeyer added the comment:
Could you please specify:
- which commits are you comparing exactly? From your explanation, I guess
aacc77fbd and its parent, but that's not completely fair since PEP 590 consists
of many commits (see #36974). A better comparison would be master ag
Jeroen Demeyer added the comment:
I will certainly have a look and try a few things, but it will probably be next
week.
--
Jeroen Demeyer added the comment:
See also
https://github.com/python/cpython/pull/14193#pullrequestreview-251630953
--
nosy: +jdemeyer
Jeroen Demeyer added the comment:
I did some benchmarks WITHOUT PGO (simply because it's much faster to compile
and therefore easier to test things out).
The command I used for testing is
./python -m perf timeit --duplicate 200 -s 'f = len; x
Change by Jeroen Demeyer :
--
pull_requests: +14578
pull_request: https://github.com/python/cpython/pull/14782
Change by Jeroen Demeyer :
--
pull_requests: +14579
pull_request: https://github.com/python/cpython/pull/14782
Change by Jeroen Demeyer :
--
versions: -Python 3.9
Jeroen Demeyer added the comment:
PR 14782 (backport of PR 13781) fixes the regression for me.
--
Change by Jeroen Demeyer :
--
keywords: +patch
pull_requests: +14589
stage: needs patch -> patch review
pull_request: https://github.com/python/cpython/pull/14795
Change by Jeroen Demeyer :
--
pull_requests: +14600
pull_request: https://github.com/python/cpython/pull/14804
Jeroen Demeyer added the comment:
One possible solution would be to have a macro to suppress the tp_print field
in the first place. Something like

    #ifndef PY_NO_TP_PRINT
    /* bpo-37250: kept for backwards compatibility in CPython 3.8 only */
    Py_DEPRECATED(3.8) int (*tp_print)(PyObject
New submission from Jeroen Demeyer :
>>> class S(str):
...     __eq__ = int.__eq__
...
>>> S() == S()
True
The expectation is that this raises an exception because int.__eq__() is called
on S instances.
--
components: Interpreter Core
messages: 348108
nosy: jdemeye
Change by Jeroen Demeyer :
--
keywords: +patch
pull_requests: +14627
stage: -> patch review
pull_request: https://github.com/python/cpython/pull/14836
Jeroen Demeyer added the comment:
> We have some reserved/deprecated/unused fields. Setting 0 to them makes
> forward incompatible code.
Good point. tp_print is removed in 3.9
--
Jeroen Demeyer added the comment:
I support the patch proposed in https://bugs.python.org/file48478/pyport.h.diff
but it's not up to me to make that decision.
--
Change by Jeroen Demeyer :
--
keywords: +patch
pull_requests: +14652
stage: -> patch review
pull_request: https://github.com/python/cpython/pull/14863
New submission from Jeroen Demeyer :
PyEval_GetFuncName is bad API because
1. It hardcodes a limited number of function classes (which doesn't even
include all function classes in the core interpreter) instead of supporting
duck-typing.
2. In case of a "function" object,
Jeroen Demeyer added the comment:
4. It uses the __name__ instead of the __qualname__
--
Change by Jeroen Demeyer :
--
keywords: +patch
pull_requests: +14673
stage: -> patch review
pull_request: https://github.com/python/cpython/pull/14890
Jeroen Demeyer added the comment:
> If we want to support other numerical types with loss in double rounding, the
> most reliable way is to represent them as fractions (x.as_integer_ratio() or
> (x.numerator, x.denominator))
See
https://discuss.python.org/t/pep-3141-ratio-i
Jeroen Demeyer added the comment:
Another solution would be to change the __str__ of various function objects to
a prettier output. For example, we currently have

>>> def f(): pass
>>> print(f)
<function f at 0x...>

We could change this to

>>> def f(): pass
>>> print(f)
f
Jeroen Demeyer added the comment:
See
https://discuss.python.org/t/pep-3141-ratio-instead-of-numerator-denominator/2037/24?u=jdemeyer
for a proposal to define a new dunder __ratio__ (instead of as_integer_ratio)
for this.
--
nosy: +jdemeyer
Jeroen Demeyer added the comment:
Please close
--
Jeroen Demeyer added the comment:
> Should we add a note like "if you get a 'SystemError: bad call flags' on
> import, check the descriptor flags of your functions" in What's New in Python
> 3.8?
A better solution would be to change the error message. We
Jeroen Demeyer added the comment:
I agree with rejecting and closing this issue.
--
nosy: +jdemeyer
New submission from Jeroen Demeyer :
Take the LIKELY/UNLIKELY macros out of Objects/obmalloc.c (renaming them of
course). Use them in a few places to micro-optimize vectorcall.
--
components: Interpreter Core
messages: 349108
nosy: Mark.Shannon, inada.naoki, jdemeyer
priority: normal
Change by Jeroen Demeyer :
--
keywords: +patch
pull_requests: +14881
stage: -> patch review
pull_request: https://github.com/python/cpython/pull/15144
Jeroen Demeyer added the comment:
Another idea I had is to somehow deal with this in PyErr_WriteUnraisable:
whenever PyErr_WriteUnraisable is called for a KeyboardInterrupt, defer that
exception to a later time, for example when _PyEval_EvalFrameDefault() is
called
Jeroen Demeyer added the comment:
These functions are now officially deprecated, see PR 14804. So I think that
this issue can be closed.
--
nosy: +jdemeyer
Jeroen Demeyer added the comment:
Aren't you worried about using the non-special non-reserved attributes like
"as_integer_ratio"? That's the reason why I proposed a dunder name "__ratio__"
instead of "as_integer_ratio".
In my opinion, it was a mista
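A short illustration of the concern: as_integer_ratio is an ordinary attribute name that float (and, since Python 3.8, Fraction) happen to implement, but nothing reserves it:

```python
from fractions import Fraction

# Built-in numeric types return an exact (numerator, denominator) pair:
assert (0.5).as_integer_ratio() == (1, 2)
assert Fraction(3, 4).as_integer_ratio() == (3, 4)

# The name is not reserved, so an unrelated class may already use it
# with completely different semantics:
class Odd:
    def as_integer_ratio(self):
        return "not a ratio at all"

assert Odd().as_integer_ratio() == "not a ratio at all"
```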
Changes by Jeroen Demeyer :
--
nosy: +jdemeyer
Jeroen Demeyer added the comment:
Nice analysis. I always assumed that `with` was safe from such race conditions,
but it isn't.
--
Jeroen Demeyer added the comment:
> Or we could steal a bit in the opcode encoding or something.
That seems like a very reasonable and easy-to-implement solution. It would
generalize this check:
https://github.com/python/cpython/blob/e82cf8675bacd7a03de508ed11865fc2701dcef5/Python/ceva
Jeroen Demeyer added the comment:
> It seems like that does at least try to guarantee that a signal can't
> interrupt between:
>
> lock.acquire()
> try:
> ...
Actually, I think it's between the end of the `try` and the beginning of the
`finally` (which is p
Jeroen Demeyer added the comment:
> Actually, I think it's between the end of the `try` and the beginning of the
> `finally` (which is precisely the same place that *breaks* for a with
> statement).
Sorry, this is wrong and Erik was right. But the special case doesn't
New submission from Jeroen Demeyer :
This used to work correctly in Python 2:

class Half(object):
    def __float__(self):
        return 0.5

import time
time.sleep(Half())
With Python 3.6, one gets instead
Traceback (most recent call last):
  File "test.py", line 6, in <module>
    time.
Jeroen Demeyer added the comment:
> I'm not sure in which order the conversion should be tried to avoid/reduce
> precision loss during the conversion.
I would suggest the order
1. __index__ to ensure exact conversion of exact integers
2. __float__ to ensure correct conversion
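A rough Python sketch of that lookup order; the helper name and error text are illustrative, not the actual C implementation:

```python
def to_duration(obj):
    # 1. __index__ first: exact integers convert without precision loss.
    # 2. __float__ second: everything else that knows how to be a float.
    tp = type(obj)
    if hasattr(tp, "__index__"):
        return float(tp.__index__(obj))
    if hasattr(tp, "__float__"):
        return tp.__float__(obj)
    raise TypeError(f"cannot interpret {tp.__name__!r} as a duration")
```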
Jeroen Demeyer added the comment:
> the most reliable way is to represent them as fractions (x.as_integer_ratio()
> or (x.numerator, x.denominator))
I don't think that we can rely on non-dunder names like that. They are not
reserved names, so classes can give them any semantic
Jeroen Demeyer added the comment:
> The correct code works for float and int (and maybe decimal.Decimal, I don't
> recall!)
Not for Decimal! In fact sleep(Decimal("0.99")) is interpreted as sleep(0)
because __int
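The truncation is easy to reproduce from Python:

```python
from decimal import Decimal

d = Decimal("0.99")
# __int__ truncates toward zero, losing the fractional part entirely,
# while __float__ keeps the value:
print(int(d))    # 0
print(float(d))  # 0.99
```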
Jeroen Demeyer added the comment:
> In Jeroen's API, I can see what the Python-level signal handler is, but
> there's no way to find out whether that signal handler is actually in use or
> not.
I added support for that in the latest cysignals release. Now you can do
Jeroen Demeyer added the comment:
For reference, the sources for my implementation:
https://github.com/sagemath/cysignals/blob/master/src/cysignals/pysignals.pyx
--
Jeroen Demeyer added the comment:
My proposal vastly improves the situation for Decimal. I will write a PR for
this and I hope that it won't be rejected just because it's not perfect.
--
Jeroen Demeyer added the comment:
I guess I should wait until PR 11507 is merged, to avoid merge conflicts.
--
Jeroen Demeyer added the comment:
To avoid code duplication, it's tempting to merge _PyTime_FromObject and
_PyTime_ObjectToDenominator. These two functions do almost the same thing, but
not quite.
--
Jeroen Demeyer added the comment:
The motivation for PEP 357 was certainly using an object as the index for a
sequence, but that's not the only use case.
In fact PEP 357 states "For example, the slot can be used any time Python
requires an integer internally"
So despite the
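For illustration: any object with __index__ is accepted wherever Python requires an exact integer, not only as a sequence index. The FileDescriptor class here is hypothetical:

```python
import operator

class FileDescriptor:
    # Hypothetical wrapper whose __index__ exposes its exact integer value.
    def __init__(self, fd):
        self._fd = fd
    def __index__(self):
        return self._fd

fd = FileDescriptor(7)
print(operator.index(fd))  # 7
print(hex(fd))             # '0x7' -- hex() accepts any __index__ type
print(range(10)[fd])       # 7    -- and so does sequence indexing
```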
New submission from Jeroen Demeyer :
This test was recently added (PR 6332):

    def test_no_such_executable(self):
        no_such_executable = 'no_such_executable'
        try:
            pid = posix.posix_spawn(no_such_executable,
                                    [no_such_
Jeroen Demeyer added the comment:
It's a relatively old Gentoo GNU/Linux system:
Linux tamiyo 3.17.7-gentoo #2 SMP PREEMPT Fri Dec 23 18:13:49 CET 2016 x86_64
Intel(R) Core(TM) i7-2640M CPU @ 2.80GHz GenuineIntel GNU/Linux
The problem occurs when there are directories on $PATH which ar
Jeroen Demeyer added the comment:
If __index__ doesn't "feel" right, what do you propose then to fix this issue,
keeping in mind the concerns of https://bugs.python.org/issue35707#msg333401
--
Jeroen Demeyer added the comment:
In other words: if we can only use __float__ and __int__, how do we know which
one to use?
--
Jeroen Demeyer added the comment:
> If we want to support other numerical types with loss in double rounding
Looking at the existing code, I can already see several double-rounding "bugs"
in the code, so I wouldn't be too m
Change by Jeroen Demeyer :
--
pull_requests: +11407
Change by Jeroen Demeyer :
--
pull_requests: +11407, 11408
Change by Jeroen Demeyer :
--
pull_requests: +11407, 11408, 11409
Jeroen Demeyer added the comment:
> Test with os.posix_spawn() is fine:
Indeed, the difference between posix_spawn() and posix_spawnp() is that only
the latter uses $PATH to look up the executable.
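A minimal sketch of the difference, assuming a POSIX system with "true" available on $PATH (Python 3.8+ for os.posix_spawnp):

```python
import os

# posix_spawn() requires a concrete path to the executable;
# posix_spawnp() additionally searches $PATH for a bare name.
pid = os.posix_spawnp("true", ["true"], dict(os.environ))
_, status = os.waitpid(pid, 0)
assert status == 0  # "true" was found on $PATH and exited cleanly
```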
--
Jeroen Demeyer added the comment:
There is again some discussion about this at
https://discuss.python.org/t/why-are-some-expressions-syntax-errors/420
--
Change by Jeroen Demeyer :
--
components: +Interpreter Core
Jeroen Demeyer added the comment:
> I'm also mildly concerned by how duplicative the code becomes post-patch.
I know, that's why I added that comment on GitHub.
> perhaps just implement _PyTime_ObjectToTime_t as a wrapper for
> _PyTime_ObjectToDenominator
Sure, but wil
Jeroen Demeyer added the comment:
> You've got a reference leak in your __index__ based paths.
Thanks for pointing that out. I fixed that now.
--
Jeroen Demeyer added the comment:
> it seems that Jeroen's analysis is right.
So would you be willing to merge the PR then?
--
Jeroen Demeyer added the comment:
> Fixing this on 2.7 would require additional investigation (distutils might
> have diverged)
Let's be honest, we are talking about distutils here. So it's way more likely
that it didn't diverge and that the behavior is exactly the sam
Jeroen Demeyer added the comment:
> Could you still give it a quick check?
I did just that. For reference, these are the steps:
- Check out the "2.7" branch of the cpython git repo
- ./configure --prefix=/tmp/prefix --exec-prefix=/tmp/eprefix && make && make
Jeroen Demeyer added the comment:
(note typo in the above: /tmp/prefix/pip should be /tmp/prefix/bin/pip)
--
New submission from Jeroen Demeyer :
When designing an extension type subclassing an existing type, it makes sense
to call the tp_dealloc of the base class from the tp_dealloc of the subclass.
Now suppose that I'm subclassing "list" which uses the trashcan mechanism. Then
it
Jeroen Demeyer added the comment:
The problem is easily reproduced with Cython:

cdef class List(list):
    cdef int deallocated
    def __dealloc__(self):
        if self.deallocated:
            print("Deallocated twice!")
        self.deallocated = 1

L = None
for i i
Change by Jeroen Demeyer :
--
keywords: +patch
pull_requests: +11871
stage: test needed -> patch review
Jeroen Demeyer added the comment:
NOTE: also OrderedDict currently uses trashcan hacking to work around this
problem:

    /* Call the base tp_dealloc(). Since it too uses the trashcan mechanism,
     * temporarily decrement trash_delete_nesting to prevent triggering it
     * and putting
Jeroen Demeyer added the comment:
See also https://bugs.python.org/issue35983 for another trashcan-related issue.
--
nosy: +jdemeyer