[issue30560] Py_SetFatalErrorAbortFunc: Allow embedding program to handle fatal errors
New submission from Thomas Perl:

In our application, which embeds a Python interpreter, certain system configurations (namely, a lone "libpython27.dll" without any standard library) caused silent failures (note that this is with Python 2.7, but Python 3.x would be similarly affected, as the Py_FatalError() handling is still the same): https://github.com/gpodder/gpodder/issues/286

There is a Stack Overflow thread about this: https://stackoverflow.com/questions/7688374/how-to-i-catch-and-handle-a-fatal-error-when-py-initialize-fails

The workaround described therein is: "I solved this by creating a separate executable that attempts to initialize python. My primary process will launch it and check the exit code and only call PyInitialize if the child process was successful. So, python is initialized twice, but it is better than an apparent crash to the user."

So, what if we instead allow the embedding program to set a function pointer to a function that gets called instead of abort()? We would have to make clear in the docs that after this function is called, the Python interpreter cannot be used and the application should probably exit, but at least it would allow applications to capture the error message and show it to the user (e.g. using a MessageBox() on Windows) before exiting -- see the attached patch.

Any alternative solution that does not require us to create a new process and "try to" Py_Initialize() there would be fine as well, especially since Py_FatalError() might be called in other places too, and in all those cases a user-visible dialog would be better than an apparent "crash" (the application just exits).

--
components: Interpreter Core
files: Py_SetFatalErrorAbortFunc.patch
keywords: patch
messages: 295098
nosy: thomas.perl
priority: normal
severity: normal
status: open
title: Py_SetFatalErrorAbortFunc: Allow embedding program to handle fatal errors
versions: Python 3.6
Added file: http://bugs.python.org/file46921/Py_SetFatalErrorAbortFunc.patch

___ Python tracker <http://bugs.python.org/issue30560> ___
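For illustration, an embedding application could use the proposed hook roughly as follows. This is a minimal sketch, not the attached patch: the Py_SetFatalErrorAbortFunc() name comes from the issue title, and the callback signature (a function receiving the fatal-error message) is an assumption based on the description above.

=
/* Sketch only: assumes the proposed Py_SetFatalErrorAbortFunc() registers a
 * callback that receives the Py_FatalError() message instead of the
 * interpreter calling abort(). */
#include <Python.h>
#include <stdio.h>
#include <stdlib.h>

static void
handle_fatal_error(const char *msg)
{
    /* On Windows this could be a MessageBox(); here we just log and exit.
     * The interpreter must not be used after this callback runs. */
    fprintf(stderr, "Python fatal error: %s\n", msg);
    exit(1);
}

int
main(void)
{
    Py_SetFatalErrorAbortFunc(handle_fatal_error);  /* proposed API (assumption) */
    Py_Initialize();  /* a broken install now reports an error instead of silently aborting */
    PyRun_SimpleString("print('embedded interpreter is working')");
    Py_Finalize();
    return 0;
}
=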
[issue30560] Py_SetFatalErrorAbortFunc: Allow embedding program to handle fatal errors
Thomas Perl added the comment:

Quick fix for the patch: of course, the line with abort() needs to be removed before this block:

+    if (_fatal_error_abort_func != NULL) {
+        _fatal_error_abort_func(msg);
+    } else {
+        abort();
+    }

--
___ Python tracker <http://bugs.python.org/issue30560> ___
[issue36452] Detect dict iteration "overflow" when changing keys
New submission from Thomas Perl:

Using Python 3.8 (git commit ID: d5a5a33f12b60129d57f9b423b77d2fcba506834), the following code snippet:

=
a = {0: 0}
for i in a:
    del a[i]
    a[i+1] = 0
    print(i)
=

prints the following output:

=
0
1
2
3
4
=

The reason for this seems to be the way the internal key list is managed and how the "next" value in this list is retrieved. The number of iterations seems to be related to USABLE_FRACTION(PyDict_MINSIZE).

Since cases where the dictionary size changes are detected with a RuntimeError, I would expect the invariant "the number of iterations is the len() of the dict at the time the iterator is created" to be enforced as well. Whether to raise StopIteration instead or to raise a RuntimeError is up for debate.

Attached is a patch that tries to detect this corner case and raise a RuntimeError instead (plus a unit test).

Note also that without the patch, the __length_hint__() of the iterator actually underflows:

=
a = {0: 0}
it = iter(a)
print('Length hint:', it.__length_hint__())
next(it)
print('Length hint:', it.__length_hint__())
del a[0]
a[1] = 0
next(it)
print('Length hint:', it.__length_hint__())
=

--
files: 0001-dictiterobject-Track-maximum-iteration-count-via-di-.patch
keywords: patch
messages: 338997
nosy: thomas.perl
priority: normal
severity: normal
status: open
title: Detect dict iteration "overflow" when changing keys
type: behavior
versions: Python 3.8
Added file: https://bugs.python.org/file48234/0001-dictiterobject-Track-maximum-iteration-count-via-di-.patch

___ Python tracker <https://bugs.python.org/issue36452> ___
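For context, the size-change detection mentioned above lives in the dict iterator's next functions in Objects/dictobject.c. Paraphrased from memory (exact code may differ between CPython versions), it looks roughly like the fragment below, which also shows why a del followed by an insert slips through:

=
/* Rough paraphrase of the existing guard in dictiter_iternextkey()
 * (Objects/dictobject.c).  di is the dictiterobject, d == di->di_dict. */
if (di->di_used != d->ma_used) {
    PyErr_SetString(PyExc_RuntimeError,
                    "dictionary changed size during iteration");
    di->di_used = -1;  /* make this state sticky */
    return NULL;
}
/* A del immediately followed by an insert leaves ma_used unchanged, so the
 * snippet above never trips this check -- hence the "overflow". */
=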
[issue36473] Detect all dictionary changes during iteration
Thomas Perl added the comment:

On top of issue 36452, I noticed some other corner cases that are still not handled.

For one, the patch (GitHub PR 12596) only handles iterating over keys, but not iterating over values or items:

==
a = {0: 0}
it = iter(a.values())
print('Length hint:', it.__length_hint__())
print(next(it))
print('Length hint:', it.__length_hint__())
del a[0]
a[1] = 99
print(next(it))
print('Length hint:', it.__length_hint__())
==

Replace a.values() there with a.items() -- same issue. Note that PR 12596 fixes the a.keys() case (same as iterating over "a" directly).

Applying the "di->len == 0" check in dictiter_iternextvalue() and dictiter_iternextitem() would fix those two cases above, but would still not fix the following case:

==
a = {0: 'a', 1: 'b', 2: 'c'}
it = iter(a)
i = next(it)
print('got first:', i)
del a[1]
a[1] = 'd'
i = next(it)
print('got second:', i)
i = next(it)
print('got third:', i)
try:
    i = next(it)
    raise RuntimeError(f'got fourth: {i}')
except StopIteration:
    print('stop iteration')
==

The reason for this is that the number of iterations (3 in this case) doesn't change, but the dict's keys still do, and the iteration order is as follows:

==
got first: 0
got second: 2
got third: 1
stop iteration
==

Note that the key 1 there is first deleted and then set again. I'll add a GitHub PR that tries to solve these corner cases too, by tracking modifications to the dict's keys.

--
components: Interpreter Core
messages: 339115
nosy: thomas.perl
priority: normal
severity: normal
status: open
title: Detect all dictionary changes during iteration
type: behavior
versions: Python 3.8

___ Python tracker <https://bugs.python.org/issue36473> ___
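A sketch of the guard described above for the value/item iterators. This is illustrative only -- placement and error message are assumptions, and the eventual PR may differ; di is the dictiterobject and di->len is the remaining-iteration counter from issue 36452:

==
/* Sketch of the proposed "di->len == 0" check for dictiter_iternextvalue()
 * and dictiter_iternextitem() (assumption -- not the actual PR). */
if (di->len == 0) {
    /* The dict's size is unchanged, but this iterator has already produced
     * as many items as the dict held when iteration started. */
    PyErr_SetString(PyExc_RuntimeError,
                    "dictionary keys changed during iteration");
    Py_CLEAR(di->di_dict);  /* mark the iterator as exhausted */
    return NULL;
}
==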
[issue36473] Detect all dictionary changes during iteration
Change by Thomas Perl: -- keywords: +patch pull_requests: +12554 stage: -> patch review ___ Python tracker <https://bugs.python.org/issue36473> ___
[issue36452] Detect dict iteration "overflow" when changing keys
Change by Thomas Perl: -- pull_requests: +12555 ___ Python tracker <https://bugs.python.org/issue36452> ___
[issue36473] dictkeysobject: Add maximum iteration check for .values() and .items()
Thomas Perl added the comment: Repurposing this as per: https://github.com/python/cpython/pull/12619#issuecomment-478076996 -- title: Detect all dictionary changes during iteration -> dictkeysobject: Add maximum iteration check for .values() and .items() ___ Python tracker <https://bugs.python.org/issue36473> ___
[issue27490] ARM cross-compile: pgen built without $(CFLAGS) as $(LIBRARY) dependency
New submission from Thomas Perl:

Problem description: trying to cross-compile $(LIBRARY) (libpython2.7.a, for example) causes "pgen" to be built, even though it is not used in the cross-compilation case (only a file copy is done to generate $(GRAMMAR_H) and $(GRAMMAR_C)).

The current rule for $(PGEN) in Makefile.pre.in does not include $(CFLAGS): https://hg.python.org/cpython/file/tip/Makefile.pre.in#l810

This causes problems when $(CFLAGS) changes the ARM float ABI, e.g.:

CFLAGS="-mfloat-abi=hard"

This causes the following issues at link time:

1. The .o files that get linked into "pgen" are built with $(CFLAGS) (which is good, because some of them are used for libpython as well)
2. When the "pgen" binary gets linked, $(CFLAGS) is not used
3. The linker fails to build "pgen" because of the mismatched float ABI settings:

=
[...]
arm-none-eabi-gcc -c -fno-strict-aliasing -march=armv6k -mtune=mpcore -mfloat-abi=hard -mtp=soft -fomit-frame-pointer -ffunction-sections -DARM11 -D_3DS -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes -I. -IInclude -I./Include -DPy_BUILD_CORE -o Parser/pgenmain.o Parser/pgenmain.c
arm-none-eabi-gcc -DNDEBUG -g -fwrapv -O3 -Wall -Wstrict-prototypes Parser/acceler.o Parser/grammar1.o Parser/listnode.o Parser/node.o Parser/parser.o Parser/parsetok.o Parser/bitset.o Parser/metagrammar.o Parser/firstsets.o Parser/grammar.o Parser/pgen.o Objects/obmalloc.o Python/mysnprintf.o Python/pyctype.o Parser/tokenizer_pgen.o Parser/printgrammar.o Parser/pgenmain.o -o Parser/pgen
/Users/thp/pkg/devkitPro/devkitARM/bin/../lib/gcc/arm-none-eabi/5.3.0/../../../../arm-none-eabi/bin/ld: error: Parser/acceler.o uses VFP register arguments, Parser/pgen does not
/Users/thp/pkg/devkitPro/devkitARM/bin/../lib/gcc/arm-none-eabi/5.3.0/../../../../arm-none-eabi/bin/ld: failed to merge target specific data of file Parser/acceler.o
[...]
=

Note that the error message is repeated for all .o files linked into pgen; I've only included one here for demonstration purposes.

The following patch (against a Python 2.7.12 tarball; a similar fix applies to Hg tip and Python 3) fixes the issue for me:

=
diff -u Python-2.7.12/Makefile.pre.in Python-2.7.12-fix/Makefile.pre.in
--- Python-2.7.12/Makefile.pre.in	2016-06-25 23:49:31.0 +0200
+++ Python-2.7.12-fix/Makefile.pre.in	2016-07-12 00:17:02.0 +0200
@@ -698,7 +698,7 @@
 		fi

 $(PGEN): $(PGENOBJS)
-	$(CC) $(OPT) $(LDFLAGS) $(PGENOBJS) $(LIBS) -o $(PGEN)
+	$(CC) $(OPT) $(CFLAGS) $(LDFLAGS) $(PGENOBJS) $(LIBS) -o $(PGEN)

 Parser/grammar.o: $(srcdir)/Parser/grammar.c \
 				$(srcdir)/Include/token.h \
=

Also note that the same $(CFLAGS) needs to be added to the rule for $(BUILDPYTHON) if one wants to build that as well, but in my case I only did a "make libpython2.7.a", and that indirectly depends on pgen ($(LIBRARY) -> $(LIBRARY_OBJS) -> $(PYTHON_OBJS) -> Python/graminit.o -> $(GRAMMAR_C) -> $(GRAMMAR_H) -> $(PGEN)), which results in that error message, so libpython2.7.a can't be built.

Another fix could be to make $(GRAMMAR_H) not depend on $(PGEN) if $(cross_compiling) is "yes" (if you read the rule contents for $(GRAMMAR_H), you'll find that indeed $(PGEN) isn't used at all if $(cross_compiling) is "yes"). At least for GNU make, it might be possible to avoid building "pgen" in that case as follows, removing $(PGEN) from the default dependencies of $(GRAMMAR_H):

ifneq ($(cross_compiling),yes)
$(GRAMMAR_H): $(PGEN)
endif

If this is a more acceptable solution, one could probably rewrite the 'test "$(cross_compiling)" != "yes"; then ...' part of the make rules for $(GRAMMAR_H) and $(GRAMMAR_C) with Make's ifeq; here's a patch for that instead (this also makes the dependencies clearer, since $(GRAMMAR_H) does not depend on $(GRAMMAR_INPUT) in the cross-compiling case, as it is not used):

=
diff -u Python-2.7.12/Makefile.pre.in Python-2.7.12-fix/Makefile.pre.in
--- Python-2.7.12/Makefile.pre.in	2016-06-25 23:49:31.0 +0200
+++ Python-2.7.12-fix/Makefile.pre.in	2016-07-12 00:37:43.0 +0200
@@ -680,22 +680,21 @@
 Modules/pwdmodule.o: $(srcdir)/Modules/pwdmodule.c $(srcdir)/Modules/posixmodule.h

+ifeq ($(cross_compiling),yes)
+$(GRAMMAR_H): $(srcdir)/Include/graminit.h
+	@$(MKDIR_P) Include
+	cp $(srcdir)/Include/graminit.h $(GRAMMAR_H).tmp
+	mv $(GRAMMAR_H).tmp $(GRAMMAR_H)
+$(GRAMMAR_C): $(srcdir)/Python/graminit.c
+	cp $(srcdir)/Python/graminit.c $(GRAMMAR_C).tmp
+	mv $(GRAMMAR_C).tmp $(GRAMMAR_C)
+else
 $(GRAMMAR_H): $(GRAMMAR_INPUT) $(PGEN)
 	@$(MKDIR_P) Include
-	# Avoid copying the file onto itself for an in-tree build
-	if test "$(cross_
[issue27490] ARM cross-compile: pgen built without $(CFLAGS) as $(LIBRARY) dependency
Thomas Perl added the comment: Also related: https://bugs.python.org/issue22359 -- ___ Python tracker <http://bugs.python.org/issue27490> ___
[issue27490] ARM cross-compile: pgen built without $(CFLAGS) as $(LIBRARY) dependency
Thomas Perl added the comment:

Yes, setting "CC" would fix the problem, but I guess the CFLAGS issue was just the original symptom (and my first reaction to fixing it), whereas the underlying problem is that pgen gets built in cases where it shouldn't be built at all.

The solution with "ifeq" does seem to require GNU make, and since these conditionals would still appear in the Makefile even for non-cross builds, we can't really use it if compatibility with non-GNU make is a requirement.

Based on http://gallium.inria.fr/blog/portable-conditionals-in-makefiles/, here is something that seems to work ($(PGEN_DEPS) will be $(PGEN) when cross_compiling=no, and empty when cross_compiling=yes; configure.ac errors out if cross_compiling is "maybe", so we do not need to handle that case at the moment):

=
diff -ru Python-2.7.12/Makefile.pre.in Python-2.7.12-fix/Makefile.pre.in
--- Python-2.7.12/Makefile.pre.in	2016-06-25 23:49:31.0 +0200
+++ Python-2.7.12-fix/Makefile.pre.in	2016-07-13 12:21:27.0 +0200
@@ -246,6 +246,8 @@
 ##
 # Parser
 PGEN=		Parser/pgen$(EXE)
+PGEN_DEPS0=	${cross_compiling:yes=}
+PGEN_DEPS=	${PGEN_DEPS0:no=$(PGEN)}

 PSRCS=		\
 		Parser/acceler.c \
@@ -680,7 +682,7 @@

 Modules/pwdmodule.o: $(srcdir)/Modules/pwdmodule.c $(srcdir)/Modules/posixmodule.h

-$(GRAMMAR_H): $(GRAMMAR_INPUT) $(PGEN)
+$(GRAMMAR_H): $(GRAMMAR_INPUT) $(PGEN_DEPS)
 	@$(MKDIR_P) Include
 	# Avoid copying the file onto itself for an in-tree build
 	if test "$(cross_compiling)" != "yes"; then \
=

--
___ Python tracker <http://bugs.python.org/issue27490> ___
[issue27490] ARM cross-compile: pgen built without $(CFLAGS) as $(LIBRARY) dependency
Thomas Perl added the comment:

Adding "-mfloat-abi=hard" to LDFLAGS fixes the issue for me, and also allows $(BUILDPYTHON) to be built correctly in addition to $(PGEN). So I guess the resolution to this issue is "works for me" (with setting CC or LDFLAGS properly for cross-compilation being the resolution/workaround).

Does it make sense to create a new bug, "Do not build pgen when it's not going to be used", as a follow-up to this discussion (with a patch similar to the one in http://bugs.python.org/msg270304)? If so, I'll create one. Or maybe there should be a generic configure flag along the lines of "do not run any generators" (like https://bugs.python.org/issue26662#msg270162, but including not running pgen), with the flag only issuing warnings instead of failing.

--
___ Python tracker <http://bugs.python.org/issue27490> ___
[issue27490] Do not run pgen when it is not going to be used (cross-compiling)
Thomas Perl added the comment:

Repurposing this bug as "do not run pgen". Documenting and using 'make PGEN_DEP=""' might also work; however, given that the configure script uses autoconf, and there's already code in place for PYTHON_FOR_BUILD, I've attached a patch that makes the PGEN dependency just an autoconf substitution -- this might make it clearer and avoid depending on any make substitution features. The patch is against the current Hg cpython default branch tip; a similar patch also applies against 2.7.

--
keywords: +patch
title: ARM cross-compile: pgen built without $(CFLAGS) as $(LIBRARY) dependency -> Do not run pgen when it is not going to be used (cross-compiling)
Added file: http://bugs.python.org/file43830/pgen_dependencies.patch

___ Python tracker <http://bugs.python.org/issue27490> ___
[issue27641] Do not build Programs/_freeze_importlib when cross-compiling
New submission from Thomas Perl:

Based on http://bugs.python.org/issue27490 and http://bugs.python.org/msg271495, here is a patch that makes sure Programs/_freeze_importlib is only built when not cross-compiling.

--
components: Cross-Build
files: python-freeze-importlib-cross-compiling.patch
keywords: patch
messages: 271519
nosy: Alex.Willmer, Thomas Perl, martin.panter
priority: normal
severity: normal
status: open
title: Do not build Programs/_freeze_importlib when cross-compiling
type: behavior
versions: Python 2.7, Python 3.6
Added file: http://bugs.python.org/file43920/python-freeze-importlib-cross-compiling.patch

___ Python tracker <http://bugs.python.org/issue27641> ___
[issue27641] Do not build Programs/_freeze_importlib when cross-compiling
Thomas Perl added the comment:

+1 on comment-out-regen.patch; it makes things much cleaner and removes the shell "if" in the rule body.

Just a small bikeshed issue: instead of COMMENT_REGEN, maybe call it "CROSS_COMPILE_COMMENT", "GENERATED_COMMENT", "COMMENT_IF_CROSS", or some such? That way it might be easier to read and understand the makefile rules ("COMMENT_IF_CROSS" -> "a comment character will be inserted here if cross-compiling").

--
___ Python tracker <http://bugs.python.org/issue27641> ___