[issue2521] ABC caches should use weak refs

2011-05-13 Thread H.

H.  added the comment:

ImportError: No module named _weakrefset

I hit this while trying to install the Pinax framework; here is a traceback 
for reference:

http://paste.pound-python.org/show/6536/

I saw that _weakrefset.py is not included in the package, so I copied it from 
the Python 3.1.* source into my d:\sosyal\ folder, and now everything works 
fine.

--
nosy: +bluag

___
Python tracker 
<http://bugs.python.org/issue2521>
___
___
Python-bugs-list mailing list
Unsubscribe: 
http://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue46333] ForwardRef.__eq__ does not respect module parameter

2022-01-10 Thread Andreas H.


New submission from Andreas H. :

The __eq__ method of ForwardRef does not take the module parameter into 
account.

However, ForwardRefs with different module parameters refer to different 
types even if they have the same name. Thus ForwardRefs with the same name 
but different modules should not be considered equal.


Consider the following code


from typing import *

ZZ = Optional['YY'] 
YY = int

YY = Tuple[Optional[ForwardRef("YY", module=__name__)], int]
print( YY.__args__[0].__args__[0].__forward_module__ )
# this prints None, but should print __main__ (or whatever __name__ contains)


When the first implicit ForwardRef is not created (the ZZ line commented out), the program behaves correctly:

#ZZ = Optional['YY'] 
YY = int

YY = Tuple[Optional[ForwardRef("YY", module=__name__)], int]
print( YY.__args__[0].__args__[0].__forward_module__ )
# this prints __main__ (or whatever __name__ contains)



The issue is that the line `ZZ = Optional['YY']` creates a cache entry, which 
is then re-used for the second definition `Optional[ForwardRef("YY", 
module=__name__)]` and thus shadows the different module argument of the 
ForwardRef.


This problem could be fixed if the __eq__ method of ForwardRef also checked 
for module equality,

i.e. in ForwardRef.__eq__ in typing.py replace

   return self.__forward_arg__ == other.__forward_arg__

with

   return (self.__forward_arg__ == other.__forward_arg__
           and self.__forward_module__ == other.__forward_module__)
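To illustrate the proposed semantics without touching typing.py, here is a 
minimal hypothetical stand-in (MiniForwardRef is not part of typing). Note 
that __hash__ has to stay consistent with __eq__, because typing's internal 
cache relies on both:

```python
class MiniForwardRef:
    """Minimal stand-in illustrating the proposed equality semantics."""

    def __init__(self, arg, module=None):
        self.arg = arg          # the forward-referenced name, e.g. 'YY'
        self.module = module    # the defining module, or None

    def __eq__(self, other):
        if not isinstance(other, MiniForwardRef):
            return NotImplemented
        # the proposed fix: compare the module as well as the name
        return self.arg == other.arg and self.module == other.module

    def __hash__(self):
        # must stay consistent with __eq__, otherwise a cache keyed on
        # hash/eq would still conflate refs that differ only in module
        return hash((self.arg, self.module))
```

With this, `MiniForwardRef('YY')` and `MiniForwardRef('YY', module='__main__')` 
compare unequal, so a cached `Optional['YY']` entry would no longer shadow the 
module-qualified definition.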


Ideally (and for consistency reasons) the `__repr__` method of `ForwardRef` 
would also include the module arg if it is present.

Change:

def __repr__(self):
    return f'ForwardRef({self.__forward_arg__!r})'

to

def __repr__(self):
    if self.__forward_module__ is None:
        return f'ForwardRef({self.__forward_arg__!r})'
    else:
        return (f'ForwardRef({self.__forward_arg__!r}, '
                f'module={self.__forward_module__!r})')

--
components: Library (Lib)
messages: 410221
nosy: andreash, gvanrossum, kj
priority: normal
severity: normal
status: open
title: ForwardRef.__eq__ does not respect module parameter
type: behavior
versions: Python 3.10, Python 3.9




[issue46333] ForwardRef.__eq__ does not respect module parameter

2022-01-10 Thread Andreas H.


Andreas H.  added the comment:

I will give it a try.

--




[issue46333] ForwardRef.__eq__ does not respect module parameter

2022-01-12 Thread Andreas H.


Andreas H.  added the comment:

Yeah, sure. The use-case is (de)serialization. Right now I use the library 
cattr, but there are many others. 

If you are interested there is related discussion in the cattr board [1].


The original problem is how to define the types for serialization.

1. If everything is a class, things work well, but not if type aliases are 
   used.

2. Type aliases sometimes have to be used - they cannot be avoided in all 
   cases, especially with recursive types. The famous example is

      Json = Union[ List['Json'], Dict[str, 'Json'], int, float, bool, None ]

   (Note: even though mypy does not support this construct, pylance meanwhile 
   does [2].)

3. `typing.Annotated` seems to be made for specifying additional information 
   such as value ranges, just right for (de)serialization + validation 
   contexts. Often these will just be type aliases (not used as class 
   members). Combination with typing.NewType is possible.


The problem is that the implicit `ForwardRef('Json')` cannot be automatically 
resolved (as it is only a name with no context). There is really no way this 
could be handled inside a library such as cattr.

When one wants to separate interface from implementation, this issue gets even 
more complicated: the module where the serialization function is called is 
typically different from the module with the type definition (this is probably 
more the norm than the exception).



The option I explored is to explicitly create the ForwardRef and specify the 
module parameter (even though I have to admit that I also did read that 
ForwardRef is only for internal use):

Json = Union[ List[ForwardRef('Json', module=__name__)], Dict[str, 
ForwardRef('Json', module=__name__)], int, float, bool, None ]

Ugly, but this is better than nothing.


A (worse) alternative is to do

Json = Union[ List['Json'], Dict[str, 'Json'], int, float, bool, None ]
typing._eval_type(Json, globals(), locals())

That works since it puts the ForwardRefs into the "evaluated" state 
(self.__forward_value__ is then set).
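A self-contained sketch of that workaround (note that `typing._eval_type` is a 
private API whose signature and behaviour may change between versions; this is 
written against the 3.9-3.12 style ForwardRef):

```python
import typing
from typing import Union, List, Dict

Json = Union[List['Json'], Dict[str, 'Json'], int, float, bool, None]

# private API: evaluates the ForwardRefs nested inside Json in place
typing._eval_type(Json, globals(), None)

fr = Json.__args__[0].__args__[0]  # the ForwardRef('Json') inside List[...]
fr.__forward_evaluated__           # now True (on the 3.9-3.12 implementation)
```

After this call, libraries that inspect the alias see the ForwardRef already 
carrying its resolved value instead of a bare name.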



A third, but future, alternative could be to automatically decompose the 
argument of ForwardRef into module and type. Then one could write

   Json = Union[ List[__name__+'.Json'], Dict[str, __name__+'.Json'], int, 
float, bool, None ]

and all above problems would be solved.

Then ForwardRef could stay internal, and this is more readable. However, 
static type checkers would need to understand `__name__+'TYPE'` constructs 
for this to be useful.


Anyhow, it would be nice to have a solution here.



[1]: https://github.com/python-attrs/cattrs/issues/201
[2]: 
https://devblogs.microsoft.com/python/pylance-introduces-five-new-features-that-enable-type-magic-for-python-developers/

--




[issue46333] ForwardRef.__eq__ does not respect module parameter

2022-01-12 Thread Andreas H.


Andreas H.  added the comment:

Ah, let me add one point: PEP 563 (-> `from __future__ import annotations`) 
does not help here either.

Even with PEP 563 enabled, the JSON example

   Json = Union[ List['Json'], Dict[str, 'Json'], int, float, bool, None ]

needs to be written in exactly the same way as without PEP 563. In other 
words, there are cases where `ForwardRef` cannot be avoided, and unfortunately 
these are the cases where we have the ForwardRef missing-context issue.

--




[issue46369] get_type_hints does not evaluate ForwardRefs inside NewType

2022-01-13 Thread Andreas H.


New submission from Andreas H. :

Consider the following: 

NewT = typing.NewType("NewT", typing.List[typing.Optional['Z']] )

class Z:
pass


Now get_type_hints() does not resolve the ForwardRef within NewType (but it 
does so for TypedDict, dataclasses, NamedTuple).


Neither of the following works.

1)
class dummy:
    test: NewT

get_type_hints(dummy, None, None)

print( NewT.__supertype__.__args__[0].__args__[0].__forward_evaluated__ )
# --> False

Note: inspecting the return value of get_type_hints does not change the 
outcome; get_type_hints() patches ForwardRefs in-place.


2)
get_type_hints(NewT, None, None)
# --> TypeError:  is not a module, class, method, or function



For Python 3.10+ a workaround exists, but requires access to implementation 
details of NewType:
  
   class dummy:
       test: NewT.__supertype__

   get_type_hints(dummy, globalns=sys.modules[NewT.__module__].__dict__, 
localns=None)


Possible solutions could be 
 A) to extend `get_type_hints` to explicitly handle NewType (basically call 
_eval_type for the __supertype__ member). That makes approach 2) work (but 
not 1),
 or B) to extend _eval_type() to process NewType as well. This would make 1) 
work (but not 2).

I guess, since NewType is supposed to be semantically a subclass of the 
referred type, 2) is probably the preferred approach, which would suggest A). 


Strictly speaking this issue exists in all Python versions that have NewType, 
but it is easier to fix in 3.10 because there NewType has the __module__ 
member.
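A self-contained sketch of the 3.10+ workaround described above (using 
`globals()` here instead of the `sys.modules` lookup, so the example runs 
standalone):

```python
from typing import NewType, List, Optional, get_type_hints

NewT = NewType("NewT", List[Optional['Z']])

class Z:
    pass

# workaround: hang the supertype on a dummy class so get_type_hints
# accepts it, then resolve against the namespace that defines Z
class Dummy:
    test: NewT.__supertype__

hints = get_type_hints(Dummy, globalns=globals())
# hints['test'] is now List[Optional[Z]] with the ForwardRef resolved
```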

--
components: Library (Lib)
messages: 410528
nosy: andreash, gvanrossum, kj
priority: normal
severity: normal
status: open
title: get_type_hints does not evaluate ForwardRefs inside NewType
type: behavior
versions: Python 3.10, Python 3.11




[issue46371] A better way to resolve ForwardRefs in type aliases across modules

2022-01-13 Thread Andreas H.


New submission from Andreas H. :

(De)Serialization of in-memory data structures is an important application. 
However there is a rather unpleasant issue with ForwardRefs.


One cannot export type aliases when they contain ForwardRefs (and expect 
things to work).


Consider the example:

Json = Union[ List['Json'], Dict[str, 'Json'], int, float, bool, None ]


When used in another module, the contained ForwardRefs cannot be resolved, or, 
in the worst case, resolve to wrong types if the caller's namespace contains a 
symbol with the same name as the one the ForwardRef refers to.

This of course only applies to inspection-based tools which utilize the 
run-time information, not to static type checkers.


Type aliases sometimes have to be used - they cannot be avoided in all cases, 
especially with recursive types as in the example above.



There are several options to improve the situation. These are all the options 
that came to my mind, and I want to put them up for discussion.



1. Guard with NewType

      Json = NewType('Json', Union[ List['Json'], Dict[str, 'Json'], int, 
      float, bool, None ])

   Advantage: This could allow automatic cross-module ForwardRef resolution, 
   provided issue 46369 [1] is acknowledged as a bug and fixed.
   Disadvantage: Does not create a true type alias, but a sub-type. Type casts 
   have to be used all the time, e.g. `data = Json(bool)`. So it can only be 
   applied to a subset of use-cases (but is IMO a clean solution when it fits).


2. Use the `module` parameter of ForwardRef

      Json = Union[ List[ForwardRef('Json', module=__name__)], Dict[str, 
      ForwardRef('Json', module=__name__)], int, float, bool, None ]

   Advantage: Works in 3.10.
   Disadvantage: Would require issue 46333 [2] to be fixed. ForwardRef is not 
   meant to be instantiated by the user; also the `module` parameter is 
   currently completely internal.


3. Modify ForwardRef so that it accepts a fully qualified name

      Json = Union[ List[__name__+'.Json'], Dict[str, __name__+'.Json'], int, 
      float, bool, None ]

   Advantage: This is only a tiny change (because ForwardRef already has the 
   `module` parameter). ForwardRef would stay internal. Less ugly than 2.
   Disadvantage: Still a bit ugly. Would also require issue 46333 [2] to be 
   fixed. Relative module specification (as in relative imports) would not 
   work.


4. Manual evaluation

      def resolve_type_alias(type_, globalns, localns):
          class dummy:
              test: type_
          # Note: this modifies the ForwardRefs in-place
          typing.get_type_hints(dummy, globalns, localns)

      Json = Union[ List['Json'], Dict[str, 'Json'], int, float, bool, None ]
      resolve_type_alias(Json, globals(), locals())

   Advantage: Works in many versions.
   Disadvantage: Requires the user to explicitly call the function after the 
   last referenced type is defined (this may be physically separated from the 
   type alias definition, which does not feel like a good solution, especially 
   since this ForwardRef export problem is purely technical and not even close 
   to being obvious to most people).


5. Make `get_type_hints()` work with type aliases (essentially integrate the 
   above `resolve_type_alias`). The return value of get_type_hints() would be 
   the same as the argument, just with the ForwardRefs resolved in-place.

      Json = Union[ List['Json'], Dict[str, 'Json'], int, float, bool, None ]
      get_type_hints(Json, globals(), locals())

   Advantage: same as 4), but hides the useful (but ugly) code.
   Disadvantage: same as 4).


 
6. Make all types in typing (such as List, Dict, Union, etc.) capture their 
   calling module and pass this information to ForwardRef when one is to be 
   created. Then the `module` will be correctly set already during 
   construction of the ForwardRef.

      Json = Union[ List['Json'], Dict[str, 'Json'], int, float, bool, None ]

   Advantage: This requires no user intervention. Things will "just work".
   Disadvantage: This is a rather big change. It is incompatible with the 
   caching used inside typing.py (the new __module__ parameter would need to 
   be taken into account in __hash__ and/or __eq__). And it may have other 
   issues I do not see at the moment.


7. Extend the `TypeAlias` hint so it can be used in a bracketed way, similar 
   to e.g. `Annotated`:

      Json = TypeAlias[ Union[ List['Json'], Dict[str, 'Json'], int, float, 
      bool, None ] ]

   I know, presently it is supposed to be used as `Json: TypeAlias = Union[ 
   ... ]`. But that is of no help at run-time, because the variable Json 
   carries no run-time information. So even if TypeAlias would capture the 
   calling module, this information is not passed on to the variable `Json`. 
   This is different for the bracket notation TypeAlias[ .. ].

   Advantage: Similar usage to Annotated. 
[issue46373] TypedDict and NamedTuple do not evaluate cross-module ForwardRef in all cases

2022-01-14 Thread Andreas H.


New submission from Andreas H. :

TypedDict does not resolve cross-module ForwardRefs when the ForwardRef is not 
a direct one. 

In other words the fix GH-27017 (issue 41249) for TypedDict seems incomplete.

The same issue seems to exist for NamedTuple.



Example:

   # module.py
   TD = typing.TypedDict("TD", {'test': typing.List[typing.Optional['Y']]})
   class Y:
       pass


   # other module
   class TDSub(module.TD):
       a: int

   get_type_hints(TDSub)
   # -> Exception: NameError: Y not found


On the other hand, with a direct ForwardRef, as e.g. in
   TD = typing.TypedDict("TD", {'test': 'Y'})

it works (that was indeed fixed by GH-27017).



The same issue exists for NamedTuple. There, neither of the above works, i.e. 
cross-module ForwardRefs are never resolved (but they could be - NamedTuple 
also has the __module__ member set to the calling module). I am not sure if 
inheritance for NamedTuple is supported, so I do not know if it is really a 
bug.


The problem in the code is that in TypedDict the `module` parameter is passed 
only onto the immediate ForwardRef. One option could be to recursively walk 
the type, search for unpatched ForwardRefs, and set their module parameter.
On the other hand, the retroactive patching of ForwardRefs is problematic, as 
it may mess with the caching mechanism in typing.py. There may be a type with 
a ForwardRef already in the cache (and used by more than one user), but the 
`module` is to be updated for only one of those users. So a correct 
implementation is probably tricky, or some other way has to be found to update 
the `module` of ForwardRefs (e.g. by copying the type tree).
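A minimal sketch of the recursive walk mentioned above (iter_forward_refs is a 
hypothetical helper, not an existing typing function):

```python
import typing
from typing import ForwardRef, get_args

def iter_forward_refs(tp):
    """Recursively yield every ForwardRef nested inside a type expression."""
    if isinstance(tp, ForwardRef):
        yield tp
    for arg in get_args(tp):
        yield from iter_forward_refs(arg)

# the indirect case from the example above: the ref to 'Y' is nested
tp = typing.List[typing.Optional['Y']]
refs = list(iter_forward_refs(tp))
# refs[0].__forward_arg__ == 'Y'; a fix could set the module parameter here
```

Mutating the refs found this way runs into exactly the caching problem 
described above, since the same ForwardRef object may be shared through the 
cache by unrelated types.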


For NamedTuple, the whole mechanism of passing the `module` parameter to the 
ForwardRefs is missing (not even direct ForwardRefs are handled).


Not sure how important this is (likely not very), as I do not use TypedDict 
and NamedTuple myself. This is just to report it.

--
components: Library (Lib)
messages: 410547
nosy: AlexWaygood, Jelle Zijlstra, andreash, gvanrossum, kj, kumaraditya303, 
sobolevn
priority: normal
severity: normal
status: open
title: TypedDict and NamedTuple do not evaluate cross-module ForwardRef in all 
cases
type: behavior
versions: Python 3.10, Python 3.11




[issue46369] get_type_hints does not evaluate ForwardRefs inside NewType

2022-01-14 Thread Andreas H.


Andreas H.  added the comment:

All right, B) sounds good to me. I don't think I have time today, so please 
feel free to tackle the issue. If not, I can look at it next week.

--




[issue46333] ForwardRef.__eq__ does not respect module parameter

2022-02-11 Thread Andreas H.


Change by Andreas H. :


--
pull_requests: +29443
pull_request: https://github.com/python/cpython/pull/31283




[issue45184] Add `pop` function to remove context manager from (Async)ExitStack

2022-02-23 Thread Andreas H.


Andreas H.  added the comment:

Inside the discussion an ExitPool class is sketched 
(https://mail.python.org/archives/list/python-id...@python.org/message/66W55FRCYMYF73TVMDMWDLVIZK4ZDHPD/),
 which provides this removal of context managers.

What I learned is that this would have a different cleanup mode (atexit 
style), as compared to the present ExitStack cleanup (nested style).

So, contrary to what I was originally thinking, ExitPool functionality would 
be close to, but not a strict superset of, ExitStack functionality. Still, 
such ExitPool functionality would be extremely useful.

--




[issue11975] Fix referencing of built-in types (list, int, ...)

2011-05-28 Thread Jonas H.

Jonas H.  added the comment:

Does that look good to you? If it does, I'll go on using the script 
(http://paste.pocoo.org/show/396661/) on the 3.x docs.

--
keywords: +patch
Added file: http://bugs.python.org/file22164/p1.patch




[issue11975] Fix referencing of built-in types (list, int, ...)

2011-05-29 Thread Jonas H.

Jonas H.  added the comment:

Linking a class using a function directive is counter-intuitive. That's why we 
need to make use of class directives rather than function directives here.

--




[issue11975] Fix referencing of built-in types (list, int, ...)

2011-05-30 Thread Jonas H.

Jonas H.  added the comment:

I'm not.

My patch doesn't address the problem of unlinkable methods, but wrong type 
declarations (read: wrong usage of ".. function::" directives) for builtins 
like int, float, bool, list etc. Because the directives change, the roles used 
to link to them (":func:`list`") have to be changed accordingly. That's what 
this patch does.

I want to address `list` method documentation in the next step.

--




[issue11975] Fix referencing of built-in types (list, int, ...)

2011-05-30 Thread Jonas H.

Jonas H.  added the comment:

> Could you make an effort to accept our word that using :class: instead of 
> :func: would bring zero value to the indexing system nor to human readers?

I'm already doing so; but I don't see anyone having made a good point against 
my preference of using ".. class::" to document classes.

--




[issue11975] Fix referencing of built-in types (list, int, ...)

2011-05-30 Thread Jonas H.

Jonas H.  added the comment:

What's wrong with the changes I propose with the patch, then? Sorry, I really 
don't get it, no matter how hard I try.

--





[issue11975] Fix referencing of built-in types (list, int, ...)

2011-05-30 Thread Jonas H.

Jonas H.  added the comment:

> when you mark up something with a mod, func, class or meth role, Sphinx will 
> find the target without paying attention to its type.  So changing :func: to 
> :class: does not bring anything.

From a quick test this seems to hold true for links within one project, but 
not for Sphinx's intersphinx extension, which actually cares about types.

So the question is whether we keep CPython implementation details (many 
builtins being both a class and a function) out of the documentation or we get 
the Sphinx developers to change intersphinx behaviour.  I guess you'd suggest 
the latter, right? :-)

--




[issue11975] Fix referencing of built-in types (list, int, ...)

2011-05-30 Thread Jonas H.

Jonas H.  added the comment:

> So the intersphinx behavior is the "correct" one, but we can't change the 
> other now because of compatibility.

Could you be convinced to use that legacy behaviour for intersphinx, too? :-)

--




[issue11975] Fix referencing of built-in types (list, int, ...)

2011-06-01 Thread Jonas H.

Jonas H.  added the comment:

> Jonas, I owe you an apology [...]

Thanks Éric, I got a bit worried about getting on your nerves...

Based on Ezio's idea: What happens if we have both a ".. function:: foo" and 
".. class:: foo" -- where do :func:`foo` and :class:`foo` link to (internally 
and using intersphinx)?

--




[issue11975] Fix referencing of built-in types (list, int, ...)

2011-06-07 Thread Jonas H.

Jonas H.  added the comment:

Having one page with two objects of the same name, e.g.

  .. function:: foo

  .. class:: foo

renders to two entries with the same anchor name (#foo). The first entry gets a 
link-to-this-paragraph marker, the second one doesn't.

Internal references (from within the same document) always link to the first 
entry because they use #foo anchor. (So if you put the class directive first, 
all links go to the class anchor.)

The first external reference (using intersphinx) always goes to the first 
target document element - no matter which type both have. The second reference 
isn't turned into a hyperlink.

This behaviour seems consistent with how HTML anchors work.

Having the two objects on two different pages however shows slightly odd 
results. Say we have this code on page 1:

  .. class:: foo

  :class:`foo`
  :func:`foo`

and

  .. function:: foo

on page 2, then both links in page 1 go to the page 1 'foo' (the class). 
However if you change order (putting the func role before the class role), 
those links go to the page 2 'foo' (the function).

All intersphinx-ed links go to the object on page 1, no matter the role order 
on page 1 or the external page.


I think we can conclude that using class and function directives at the same 
time doesn't help very much...

--




[issue12284] argparse.ArgumentParser: usage example option

2011-06-08 Thread Jonas H.

New submission from Jonas H. :

I'd like to see an `examples` option added to argparse.ArgumentParser, as 
found in many man pages.

This could also be done using the `epilog` option, but that misses the 
"%(prog)s" replacement, which makes usage examples like this

  Example usage:
    ./script.py option1 option2

impossible.
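For what it's worth, a sketch of approximating this with `epilog` today: in 
current argparse the epilog text is passed through the same `%(prog)s` 
substitution as usage strings, and `RawDescriptionHelpFormatter` preserves the 
manual line breaks:

```python
import argparse

parser = argparse.ArgumentParser(
    prog="script.py",
    # RawDescriptionHelpFormatter keeps the epilog's line breaks intact
    formatter_class=argparse.RawDescriptionHelpFormatter,
    epilog="Example usage:\n  %(prog)s option1 option2",
)

help_text = parser.format_help()
# the epilog appears at the end of the help with %(prog)s substituted
```

A dedicated `examples` option would still read better, but this covers the 
`%(prog)s` replacement mentioned above.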

--
components: Library (Lib)
messages: 137905
nosy: jonash
priority: normal
severity: normal
status: open
title: argparse.ArgumentParser: usage example option
type: feature request
versions: Python 2.7




[issue12284] argparse.ArgumentParser: usage example option

2011-06-09 Thread Jonas H.

Jonas H.  added the comment:

Nope. I want an "examples" section, for example from `man git log`:


EXAMPLES
   git log --no-merges
       Show the whole commit history, but skip any merges

   git log v2.6.12.. include/scsi drivers/scsi
       Show all commits since version v2.6.12 that changed any file in the
       include/scsi or drivers/scsi subdirectories

   ...

--




[issue12877] Popen(...).stdout.seek(...) throws "Illegal seek"

2011-09-01 Thread Jonas H.

New submission from Jonas H. :

from subprocess import Popen, PIPE
p = Popen(['ls'], stdout=PIPE)
p.wait()
p.stdout.seek(0)


Traceback (most recent call last):
  File "t.py", line 5, in <module>
    p.stdout.seek(0)
IOError: [Errno 29] Illegal seek

Python 2.7.2, Arch Linux x86-64 (Kernel 3.0)
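(For reference: a pipe is not a seekable file. A sketch of a workaround that 
reads the output into a seekable in-memory buffer; `sys.executable -c` is used 
here only to keep the example portable:)

```python
import io
import subprocess
import sys

p = subprocess.Popen([sys.executable, "-c", "print('hello')"],
                     stdout=subprocess.PIPE)
# p.stdout is a buffered reader over a pipe; the pipe cannot seek
assert p.stdout.seekable() is False

out, _ = p.communicate()  # drain the pipe; also avoids wait() deadlocks

buf = io.BytesIO(out)     # an in-memory buffer IS seekable
buf.seek(0)
data = buf.read()         # b'hello' plus a platform line ending
```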

--
messages: 143323
nosy: jonash
priority: normal
severity: normal
status: open
title: Popen(...).stdout.seek(...) throws "Illegal seek"
versions: Python 2.7




[issue12877] Popen(...).stdout.seek(...) throws "Illegal seek"

2011-09-01 Thread Jonas H.

Jonas H.  added the comment:

Why does it have a 'seek' method then?

--




[issue1731717] race condition in subprocess module

2010-09-06 Thread Jonas H.

Changes by Jonas H. :


--
nosy: +jonash




[issue11258] ctypes: Speed up find_library() on Linux by 500%

2011-02-24 Thread Jonas H.

Changes by Jonas H. :


Added file: 
http://bugs.python.org/file20874/faster-find-library1-py3k-with-escaped-name.diff




[issue11258] ctypes: Speed up find_library() on Linux by 500%

2011-02-24 Thread Jonas H.

Jonas H.  added the comment:

As far as I can tell, it doesn't matter.

We're looking for the part after the => in any case - ignoring the 
ABI/architecture information - so the regex would choose the first of those 
entries.
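For illustration, a sketch of that parsing approach against made-up 
`ldconfig -p` output (the regex here is illustrative, not the exact one from 
the attached patch):

```python
import re

# hypothetical sample of `ldconfig -p` output, with two ABI variants of libc
sample = """\
3 libs found in cache `/etc/ld.so.cache'
\tlibc.so.6 (libc6,x86-64) => /lib/x86_64-linux-gnu/libc.so.6
\tlibc.so.6 (libc6) => /lib32/libc.so.6
\tlibm.so.6 (libc6,x86-64) => /lib/x86_64-linux-gnu/libm.so.6
"""

def find_library(name, output):
    # take the part after "=>", ignoring the ABI/architecture field in
    # parentheses, as described above; the first match wins
    pattern = r'lib%s\.so[^\s]*\s\([^)]*\)\s=>\s(\S+)' % re.escape(name)
    m = re.search(pattern, output)
    return m.group(1) if m else None
```

With the sample above, `find_library('c', sample)` picks the first libc entry 
regardless of which ABI variant it is.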

--




[issue11258] ctypes: Speed up find_library() on Linux by 500%

2011-02-27 Thread Jonas H.

Jonas H.  added the comment:

Humm. Would be great to have the `ldconfig -p` output of such a machine... I 
can't get ldconfig to recognize 64-bit libraries on my 32-bit machines, so I 
have no output to test against...

--




[issue11345] Fix a link in library/json docs

2011-02-27 Thread Jonas H.

New submission from Jonas H. :

I guess this should be a link.

--
assignee: docs@python
components: Documentation
files: fix-json-link.diff
keywords: patch
messages: 129629
nosy: docs@python, jonash
priority: normal
severity: normal
status: open
title: Fix a link in library/json docs
versions: Python 3.3
Added file: http://bugs.python.org/file20932/fix-json-link.diff




[issue11258] ctypes: Speed up find_library() on Linux by 500%

2011-02-27 Thread Jonas H.

Jonas H.  added the comment:

> the orig impl matches the abi_type at the beginning of the parentheses,
> yours simply ignores the abi_type (that should have caught my eye, but that
> regex looked so much like magic that I didn't try to make sense of it :-))

Same here. :)

The version I attached seems to work for me. It's some kind of compromise -- 
basically it's the original regex, but with the unnecessary, 
performance-decreasing cruft stripped away.

btw, "Jonas H." is perfectly fine - I don't care about being honored, I just 
want to `import uuid` without waiting forever. :-)

--
Added file: 
http://bugs.python.org/file20935/faster-find-library1-py3k-with-escaped-name-try2.diff




[issue4600] __class__ assignment: new-style? heap? == confusing

2011-02-27 Thread Jonas H.

Jonas H.  added the comment:

Here comes a patch, changing the behaviour to:

./python -q
>>> class C:
...   pass
... 
>>> (1).__class__ = 1
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: __class__ must be set to a class defined by a class statement, not 
'int' object
>>> (1).__class__ = object
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: __class__ must be set to a class defined by a class statement, not 
'object'
>>> (1).__class__ = C
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: __class__ assignment: only for instances of classes defined by class 
statements

--
keywords: +patch
nosy: +jonash
Added file: http://bugs.python.org/file20937/4600.diff




[issue11484] `with_traceback` in 2.7 docs but not implemented

2011-03-13 Thread Jonas H.

New submission from Jonas H. :

Either a `BaseException.with_traceback` implementation is missing or the docs 
are wrong.

http://docs.python.org/library/exceptions.html?highlight=with_traceback#exceptions.BaseException.with_traceback

python3 -c 'print("with_traceback" in dir(BaseException))'
True
python2 -c 'print("with_traceback" in dir(BaseException))'
False
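For reference, on Python 3 -- where the method does exist -- it attaches a 
traceback to the exception instance and returns that same instance:

```python
import sys

# with_traceback() sets __traceback__ and returns the exception itself,
# which is what the (incorrectly inherited) 2.7 docs describe.
try:
    raise ValueError("original")
except ValueError as exc:
    tb = sys.exc_info()[2]
    same = exc.with_traceback(tb)
    print(same is exc)               # True: the exception itself is returned
    print(same.__traceback__ is tb)  # True: the traceback is attached
```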

--
assignee: docs@python
components: Documentation
messages: 130760
nosy: docs@python, jonash
priority: normal
severity: normal
status: open
title: `with_traceback` in 2.7 docs but not implemented
versions: Python 2.7

___
Python tracker 
<http://bugs.python.org/issue11484>
___



[issue11258] ctypes: Speed up find_library() on Linux by 500%

2011-04-19 Thread Jonas H.

Jonas H.  added the comment:

*push* Any way to get this into the codebase?

--

___
Python tracker 
<http://bugs.python.org/issue11258>
___



[issue11975] Fix intersphinx-ing of built-in types (list, int, ...)

2011-05-01 Thread Jonas H.

New submission from Jonas H. :

Intersphinx-ing of int, list, float, ... should work with ":class:`int`" (list, 
float, ...). Also, intersphinx-ing list methods, e.g. ":meth:`list.insert`", 
should work.

--
assignee: docs@python
components: Documentation
messages: 134923
nosy: docs@python, jonash
priority: normal
severity: normal
status: open
title: Fix intersphinx-ing of built-in types (list, int, ...)

___
Python tracker 
<http://bugs.python.org/issue11975>
___



[issue11976] Provide proper documentation for list data type

2011-05-01 Thread Jonas H.

New submission from Jonas H. :

Provide a proper `list` method reference (like the one for `dict`, 
http://docs.python.org/library/stdtypes.html#dict).

Right now, documentation about lists is spread over multiple topics (.rst 
files) and methods are documented in footnotes.

Also, intersphinx-ing and list methods is not possible -- :meth:`list.foo` does 
not create any links due to missing documentation. This is also related to 
#11975.

--
assignee: docs@python
components: Documentation
messages: 134924
nosy: docs@python, jonash
priority: normal
severity: normal
status: open
title: Provide proper documentation for list data type

___
Python tracker 
<http://bugs.python.org/issue11976>
___



[issue11977] Document int.conjugate, .denominator, ...

2011-05-01 Thread Jonas H.

New submission from Jonas H. :

Various `int` attributes and methods seem undocumented (at least 
intersphinx-ing them does not work):

* .conjugate
* .denominator
* .imag
* .numerator
* .real
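For reference, these attributes do exist on plain int instances at runtime; 
they just lack link targets in the docs. A quick check:

```python
# The numeric-tower attributes exist on int instances even though the
# docs provide no referencable entries for them.
x = 7
print(x.numerator, x.denominator)  # 7 1
print(x.real, x.imag)              # 7 0
print(x.conjugate())               # 7
```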

--
assignee: docs@python
components: Documentation
messages: 134926
nosy: docs@python, jonash
priority: normal
severity: normal
status: open
title: Document int.conjugate, .denominator, ...
versions: Python 2.7

___
Python tracker 
<http://bugs.python.org/issue11977>
___



[issue11975] Fix referencing of built-in types (list, int, ...)

2011-05-02 Thread Jonas H.

Jonas H.  added the comment:

Actually I need to be able to intersphinx (because my documentation work is not 
the Python docs :-) but I guess it boils down to the same problem of incomplete 
Sphinx module/class indices.

--

___
Python tracker 
<http://bugs.python.org/issue11975>
___



[issue11975] Fix referencing of built-in types (list, int, ...)

2011-05-02 Thread Jonas H.

Jonas H.  added the comment:

Indeed they do; but documentation writers need to know that `int()` and 
`float()` are functions, which is counterintuitive. (and a CPython 
implementation detail)

--

___
Python tracker 
<http://bugs.python.org/issue11975>
___



[issue11975] Fix referencing of built-in types (list, int, ...)

2011-05-06 Thread Jonas H.

Jonas H.  added the comment:

> Is this a problem in our markup or a bug in intersphinx?

It's a markup problem -- those types are documented as functions, using the 
:func: role/`.. func::` directive.

It's not only a markup mismatch but, strictly speaking, it's *wrong* 
documentation: str, int, ... aren't functions, they're *classes* and should be 
documented as such. It's a bit odd to search for information on the str *class* 
in the list of built-in *functions*.
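The claim is easy to verify at runtime -- the built-ins in question are types, 
not functions:

```python
import types

# int, str, list and float are classes, not plain functions.
for obj in (int, str, list, float):
    print(obj.__name__, isinstance(obj, type))  # each prints "<name> True"
    assert not isinstance(obj, types.FunctionType)
```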


> If this work within one documentation set (as I believe it does)

It does not. For example, in http://docs.python.org/library/functions.html#max 
there's a reference to list.sort using :meth:`list.sort` but no link could be 
generated. How could it possibly work without decent documentation about the 
list data type?

--

___
Python tracker 
<http://bugs.python.org/issue11975>
___



[issue11977] Document int.conjugate, .denominator, ...

2011-05-06 Thread Jonas H.

Jonas H.  added the comment:

It doesn't. Sphinx still can't make any links, which btw also means that it's 
impossible to reference those methods within the Python documentation.

Also I want to point out that I find the information very hard to find as a 
human. The fact that integers are based on `numbers.Number` is -- at this point 
in time where 95% of all Python developers don't know about the `numbers` 
module or abstract base classes in general -- an implementation detail and as 
such should not affect the way `int` is documented.

I propose to have decent class references for int, str, ... similar to the 
reference for dict -- that is, document all attributes and methods in one place 
and make them referencable. For those who want to deep-dive into CPython 
internals, a note about that functionality actually being implemented in ABCs 
could be added.

(But I think that's out of scope for this ticket. I could open a new one if 
anyone agrees with me... :-)

--

___
Python tracker 
<http://bugs.python.org/issue11977>
___



[issue11975] Fix referencing of built-in types (list, int, ...)

2011-05-06 Thread Jonas H.

Jonas H.  added the comment:

Shouldn't have used "decent" here, sorry. What I was trying to say is that 
there's no "reference-like" documentation for the list datatype (as for dict). 
There's more than enough quality documentation about lists but I think the way 
it's arranged can be improved.

--

___
Python tracker 
<http://bugs.python.org/issue11975>
___



[issue11106] python 2.6.6 and python 2.7.1 cannot be built successfully because of an segment fault on NetBSD-5.1-sparc

2011-02-03 Thread H Xu

New submission from H Xu :

Build python 2.6.6 and python 2.7.1 on a NetBSD-5.1-sparc machine.

1. Run './configure';
2. Run 'make';
3. Run 'make install'.

There will be a problem after running 'make install'.
The last few lines of error messages are like the following:

Compiling /usr/local/lib/python2.6/test/test_binop.py ...
Compiling /usr/local/lib/python2.6/test/test_bisect.py ...
Compiling /usr/local/lib/python2.6/test/test_bool.py ...
Compiling /usr/local/lib/python2.6/test/test_bsddb.py ...
Compiling /usr/local/lib/python2.6/test/test_bsddb185.py ...
Compiling /usr/local/lib/python2.6/test/test_bsddb3.py ...
Compiling /usr/local/lib/python2.6/test/test_buffer.py ...
Compiling /usr/local/lib/python2.6/test/test_bufio.py ...
Compiling /usr/local/lib/python2.6/test/test_builtin.py ...
[1]   Segmentation fault (core dumped) PYTHONPATH=/usr/...
*** Error code 139

Stop.
make: stopped in /home/xuh/src/Python-2.6.6

Same thing with python 2.7.1.

--
components: Build
messages: 127802
nosy: H.Xu
priority: normal
severity: normal
status: open
title: python 2.6.6 and python 2.7.1 cannot be built successfully because of an 
segment fault on NetBSD-5.1-sparc
type: compile error
versions: Python 2.6, Python 2.7

___
Python tracker 
<http://bugs.python.org/issue11106>
___



[issue11106] python 2.6.6 and python 2.7.1 cannot be built successfully because of an segment fault on NetBSD-5.1-sparc

2011-02-03 Thread H Xu

H Xu  added the comment:

The result of 'make install SHELL="bash -x"' seems no different from that of 
plain "make install". Could there be any other way to debug?

--

___
Python tracker 
<http://bugs.python.org/issue11106>
___



[issue11258] ctypes: Speed up find_library() on Linux by 500%

2011-02-20 Thread Jonas H.

New submission from Jonas H. :

(This applies to all versions of Python I investigated, although the attached 
patch is for Python 2.7)

I wondered why `import uuid` took so long, so I did some profiling.

It turns out that `find_library` wastes a lot of time because of this crazy 
regular expression in `_findSoname_ldconfig`.

A quick look at the ldconfig source (namely, the print_cache routine which is 
invoked when you call `ldconfig -p`, 
http://sourceware.org/git/?p=glibc.git;a=blob;f=elf/cache.c#l127) confirmed my 
suspicion that ldconfig's output could easily be parsed without such a 
regex monster.

I attached two patches that fix this problem. Choose one! ;-)

The ctypes tests pass with my fixes, and here comes some benchmarking:

$ cat benchmark_ctypes.py 
from ctypes.util import find_library
for i in xrange(10):
  for lib in ['mm', 'c', 'bz2', 'uuid']:
find_library(lib)

# Current implementation
$ time python benchmark_ctypes.py 
real0m11.813s
...
$ time python -c 'import uuid'
real0m0.625s
...

# With my patch applied
$ cp /tmp/ctypesutil.py ctypes/util.py
$ time python benchmark_ctypes.py 
real0m1.785s
...
$ time python -c 'import uuid'
real0m0.182s
...
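As a rough, hypothetical sketch of the idea (the actual change is in the 
attached diffs for ctypes/util.py): each `ldconfig -p` cache line can simply be 
split at the `=>` marker instead of being matched by the heavyweight regex:

```python
# Hypothetical sketch only -- the real patch modifies
# _findSoname_ldconfig in ctypes/util.py.
def find_soname(name, ldconfig_output):
    prefix = 'lib%s.so' % name
    for line in ldconfig_output.splitlines():
        # Cache lines look like:
        #   libuuid.so.1 (libc6,x86-64) => /lib/x86_64-linux-gnu/libuuid.so.1
        if '=>' not in line:
            continue
        soname = line.strip().split()[0]
        if soname.startswith(prefix):
            return soname
    return None

sample = (
    "\tlibuuid.so.1 (libc6,x86-64) => /lib/x86_64-linux-gnu/libuuid.so.1\n"
    "\tlibc.so.6 (libc6,x86-64) => /lib/x86_64-linux-gnu/libc.so.6\n"
)
print(find_soname('uuid', sample))  # libuuid.so.1
```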

--
assignee: theller
components: ctypes
files: faster-find-library1.diff
keywords: patch
messages: 128910
nosy: jonash, theller
priority: normal
severity: normal
status: open
title: ctypes: Speed up find_library() on Linux by 500%
type: performance
versions: Python 2.5, Python 2.6, Python 2.7, Python 3.1, Python 3.2, Python 3.3
Added file: http://bugs.python.org/file20808/faster-find-library1.diff

___
Python tracker 
<http://bugs.python.org/issue11258>
___



[issue11258] ctypes: Speed up find_library() on Linux by 500%

2011-02-20 Thread Jonas H.

Changes by Jonas H. :


Added file: http://bugs.python.org/file20809/faster-find-library2.diff

___
Python tracker 
<http://bugs.python.org/issue11258>
___



[issue11258] ctypes: Speed up find_library() on Linux by 500%

2011-02-20 Thread Jonas H.

Jonas H.  added the comment:

(might also be related to http://bugs.python.org/issue11063)

--

___
Python tracker 
<http://bugs.python.org/issue11258>
___



[issue8415] namedtuple vs tuple

2010-04-15 Thread H Krishnan

New submission from H Krishnan :

Named tuples and tuples have different creation behavior. Changing a tuple to a 
namedtuple will involve changing the usage as well. For example:

>>> ntuple = collections.namedtuple("ntuple", "a,b")
>>> ntuple(1,2)
ntuple(a=1, b=2)
>>> tuple(1,2)
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: tuple() takes at most 1 argument (2 given)
>>> tuple([1,2])
(1, 2)
>>> ntuple([1,2])
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
TypeError: __new__() takes exactly 3 arguments (2 given)
>>>

Because of this, to create a tuple object given a 'tuple class', we need to do 
something like:
def makeTuple(tupleCls, *args):
    if hasattr(tupleCls, "_fields"):
        return tupleCls(*args)
    else:
        return tupleCls(args)

My suggestion: A namedtuple should also accept a single iterable as argument, 
in which case, the iterable will be broken up and assigned to individual fields.
This will break an existing behaviour of namedtuple: if only one field is 
present in the namedtuple and an iterable is passed to the namedtuple, that 
field is currently assigned the iterable. However, namedtuples are seldom used 
for single fields and so this may not be that important.
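Worth noting: the stdlib already offers part of this via the `_make` 
classmethod, which builds a namedtuple instance from a single iterable:

```python
import collections

ntuple = collections.namedtuple("ntuple", "a,b")

# _make mirrors tuple([1, 2]): one iterable, unpacked into the fields.
nt = ntuple._make([1, 2])
print(nt)            # ntuple(a=1, b=2)
print(nt == (1, 2))  # True
```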

--
components: None
messages: 103289
nosy: hkrishnan
severity: normal
status: open
title: namedtuple vs tuple
type: feature request
versions: Python 2.6

___
Python tracker 
<http://bugs.python.org/issue8415>
___



[issue8415] namedtuple vs tuple

2010-04-18 Thread H Krishnan

H Krishnan  added the comment:

Sorry, I didn't know about "python-ideas".
Actually, there is a way to do this without breaking any existing code.

namedtuple could support an optional additional argument, say, useIterableCtr, 
which is by default False, and the class template could be appropriately 
modified based on this argument.

I couldn't find the PEP for this, but I notice that other people have also 
suggested iterable as argument in the ActiveState recipe page for this.

--

___
Python tracker 
<http://bugs.python.org/issue8415>
___



[issue46657] Add mimalloc memory allocator

2022-03-23 Thread h-vetinari


Change by h-vetinari :


--
nosy: +h-vetinari

___
Python tracker 
<https://bugs.python.org/issue46657>
___



[issue38444] dataclass: always generate default __init__ on __default_init__

2019-10-11 Thread Shmuel H.


New submission from Shmuel H. :

Currently, `dataclasses.dataclass` will generate `__init__` only where the user 
has not defined one. 

However, sometimes, with frozen classes or dataclasses with a lot of members, 
redefinition of this function is not trivial,
especially if the only purpose is to change the default behaviour for only one 
member:
```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Dataclass:
#...big list of members
member20: int

def __init__(self, member20: str, **kwargs):
# self.member20 = int(member20)
object.__setattr__(self, "member20", int(member20))
# Now we have to trivially initialize 
# 20 other members like that :[
```
My idea is to generate the default `__init__` as `__default_init__` even if 
the user has defined their own version.
That will allow them to use it like that:
 ```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Dataclass:
#...big list of members
member20: int

def __init__(self, member20: str, **kwargs):
# Oh, that's better :)
self.__default_init__(member20=int(member20), **kwargs)
```

Implementing that is pretty trivial (I can do that if this change is 
approved). 
Please let me know what you think about that.

--
components: Library (Lib)
messages: 354437
nosy: Shmuel H.
priority: normal
severity: normal
status: open
title: dataclass: always generate default __init__ on __default_init__
type: enhancement
versions: Python 3.7, Python 3.8, Python 3.9

___
Python tracker 
<https://bugs.python.org/issue38444>
___



[issue38444] dataclass: always generate default __init__ on __default_init__

2019-10-11 Thread Shmuel H.


Shmuel H.  added the comment:

I think it was designed to. However, it is not very usable in production for a 
number of reasons:
1. It won't work with frozen instances (you'll have to call 
`object.__setattr__` directly).
2. It gets very messy with more than one or two `InitVar`s, which makes it very 
hard to differentiate between "real" values, `InitVar`s, and the init logic:
```python
from dataclasses import dataclass, InitVar
@dataclass
class DataClass:
member0_init: InitVar[str] = None
member1_init: InitVar[list] = None

member0: int = None
member1: dict = None

def __post_init__(self, member0_init: str, member1_init: list):
if member0_init is not None and self.member0 is None:
self.member0 = int(member0_init)
if member1_init is not None and self.member1 is None:
self.member1 = dict(member1_init)
```
That code should be equivalent to:
```python
from dataclasses import dataclass
from typing import Union
@dataclass
class DataClass:
member0: int
member1: dict

def __init__(self, member0: Union[int, str], member1: Union[dict, list]):
if isinstance(member0, str):
member0 = int(member0)
if isinstance(member1, list):
member1 = dict(member1)

self.__default_init__(member0=member0, member1=member1)
```
Which is much closer to regular python code to someone new for dataclasses.

I would be happy to hear if you have a better solution; I just think it is 
pretty simple and straight-forward.

--

___
Python tracker 
<https://bugs.python.org/issue38444>
___



[issue38444] dataclass: always generate default __init__ on __default_init__

2019-10-11 Thread Shmuel H.


Shmuel H.  added the comment:

The only other solution I could think of was to change setattr's behaviour 
dynamically so that it would be valid to call it from a frozen instance's 
`__init__`, but I think that is somehow even worse.

However, thanks for your help, I think we can close this one for now and I'll 
hopefully write that mail in the next day or two.

As for other projects, I doubt I'll find any big projects that use frozen 
dataclasses internally, but I'll try my best to come up with one.

(Only now do I realize that you're the one behind Python's dataclasses; keep up 
the good work!)

--

___
Python tracker 
<https://bugs.python.org/issue38444>
___



[issue43819] ExtensionFileLoader Does Not Implement invalidate_caches

2021-04-12 Thread Ian H


New submission from Ian H :

Currently there's no easy way to get at the internal cache of module spec 
objects for compiled extension modules. See 
https://github.com/python/cpython/blob/20ac34772aa9805ccbf082e700f2b033291ff5d2/Python/import.c#L401-L415.
 For example, these module spec objects continue to be cached even if we call 
importlib.invalidate_caches. ExtensionFileLoader doesn't implement the 
corresponding method for this.

The comment in the C file referenced above implies this is done this way to 
avoid re-initializing extension modules. I'm not sure if this can be fixed, but 
I figured I'd ask for input. Our use-case is an academic project where we've 
been experimenting with building an interface for linker namespaces into Python 
to allow for (among other things) loading multiple copies of any module without 
explicit support from that module. We've been able to do this without having 
custom builds of Python. We've instead gone the route of overriding some of the 
import machinery at runtime. To make this work we need a way to prevent caching 
of previous import-related information about a specific extension module. We 
currently have to rely on an unfortunate hack to get access to the internal 
cache of module spec objects for extension modules and modify that dictionary 
manually. What we have works, but any sort of alternative would be welcome.

--
messages: 390905
nosy: Ian.H
priority: normal
severity: normal
status: open
title: ExtensionFileLoader Does Not Implement invalidate_caches
type: behavior
versions: Python 3.9

___
Python tracker 
<https://bugs.python.org/issue43819>
___



[issue43870] C API Functions Bypass __import__ Override

2021-04-16 Thread Ian H

New submission from Ian H :

Some of the import-related C API functions are documented as bypassing an 
override to builtins.__import__. This appears to be the case, but the 
documentation is incomplete in this regard. For example, PyImport_ImportModule 
is implemented by calling PyImport_Import which does respect an override to 
builtins.__import__, but PyImport_ImportModule doesn't mention respecting an 
override. On the other hand some routines (like 
PyImport_ImportModuleLevelObject) do not respect an override to the builtin 
import.

Is this something that people are open to having fixed? I've been working on an 
academic project downstream that involved some overrides to the __import__ 
machinery (I haven't figured out a way to do this with just import hooks) and 
having some modules skip going through our override threw us for a bad 
debugging loop. The easiest long-term fix from our perspective is to patch the 
various PyImport routines to always respect an __import__ override. This 
technically is a backwards compatibility break, but I'm unsure if anyone is 
actually relying on the fact that specific C API functions bypass 
builtins.__import__ entirely. It seems more likely that the current behavior 
will cause bugs downstream like it did for us.

--
messages: 391220
nosy: Ian.H
priority: normal
severity: normal
status: open
title: C API Functions Bypass __import__ Override
type: behavior
versions: Python 3.9

___
Python tracker 
<https://bugs.python.org/issue43870>
___



[issue43895] Unnecessary Cache of Shared Object Handles

2021-04-20 Thread Ian H


New submission from Ian H :

While working on another project I noticed that there's a cache of shared 
object handles kept inside _PyImport_FindSharedFuncptr. See 
https://github.com/python/cpython/blob/b2b6cd00c6329426fc3b34700f2e22155b44168c/Python/dynload_shlib.c#L51-L55.
 It appears to be an optimization to work around poor caching of shared object 
handles in old libc implementations. After some testing, I have been unable to 
find any meaningful performance difference from this cache, so I propose we 
remove it to save space.

My initial tests were on Linux (Ubuntu 18.04). I saw no discernible difference 
in the time for running the Python test suite with a single thread. Running the 
test suite using a single thread shows a lot of variance, but after running 
with and without the cache 40 times the mean times with/without the cache was 
nearly the same. Interpreter startup time also appears to be unaffected. This 
was all with a debug build, so I'm in the process of collecting data with a 
release build to see if that changes anything.

--
messages: 391453
nosy: Ian.H
priority: normal
severity: normal
status: open
title: Unnecessary Cache of Shared Object Handles
versions: Python 3.10

___
Python tracker 
<https://bugs.python.org/issue43895>
___



[issue43895] Unnecessary Cache of Shared Object Handles

2021-04-20 Thread Ian H


Ian H  added the comment:

Proposed patch is in https://github.com/python/cpython/pull/25487.

--

___
Python tracker 
<https://bugs.python.org/issue43895>
___



[issue43895] Unnecessary Cache of Shared Object Handles

2021-04-23 Thread Ian H


Change by Ian H :


--
keywords: +patch
pull_requests: +24282
stage:  -> patch review
pull_request: https://github.com/python/cpython/pull/25487

___
Python tracker 
<https://bugs.python.org/issue43895>
___



[issue1635741] Py_Finalize() doesn't clear all Python objects at exit

2021-06-29 Thread h-vetinari


Change by h-vetinari :


--
nosy: +h-vetinari

___
Python tracker 
<https://bugs.python.org/issue1635741>
___



[issue40077] Convert static types to heap types: use PyType_FromSpec()

2021-06-29 Thread h-vetinari


Change by h-vetinari :


--
nosy: +h-vetinari

___
Python tracker 
<https://bugs.python.org/issue40077>
___



[issue39511] [subinterpreters] Per-interpreter singletons (None, True, False, etc.)

2021-06-29 Thread h-vetinari


Change by h-vetinari :


--
nosy: +h-vetinari

___
Python tracker 
<https://bugs.python.org/issue39511>
___



[issue40601] [C API] Hide static types from the limited C API

2021-06-29 Thread h-vetinari


Change by h-vetinari :


--
nosy: +h-vetinari

___
Python tracker 
<https://bugs.python.org/issue40601>
___



[issue15751] [subinterpreters] Make the PyGILState API compatible with subinterpreters

2021-06-29 Thread h-vetinari


Change by h-vetinari :


--
nosy: +h-vetinari

___
Python tracker 
<https://bugs.python.org/issue15751>
___



[issue40522] [subinterpreters] Get the current Python interpreter state from Thread Local Storage (autoTSSkey)

2021-06-29 Thread h-vetinari


Change by h-vetinari :


--
nosy: +h-vetinari

___
Python tracker 
<https://bugs.python.org/issue40522>
___



[issue44795] asyncio.run does not allow for graceful shutdown of main task

2021-07-31 Thread Andreas H.


New submission from Andreas H. :

The issue is that the main task (the one supplied to asyncio.run) has no 
chance to clean up its "own" sub-tasks and handle possible exceptions that 
occur during the sub-task clean-up. This prevents a graceful shutdown.

There is no way to prevent the current printing of the "unhandled" exception, 
even though the sub-task exception was caught by the main task. (See example 
below.)


-- Current behavior --

When asyncio.run() receives an (unhandled) exception, all tasks are cancelled 
simultaneously. 

If any task generates an exception during its clean-up phase this is printed to 
the log, even though this exception is handled by the main task.


-- Expected behavior --

asyncio.run() should first cancel the main task, wait for it to complete its 
shutdown (possibly cancelling its own sub-tasks, with exception catching), and 
*afterwards* cancel the remaining tasks.


-- Example Code --

For instance realize a graceful shutdown of a webserver when SIGTERM signal 
handler raises a SystemExit exception.




import os
import asyncio
import logging


async def main():

logging.basicConfig(level=logging.INFO)

async def sub_task():
logging.info('sub_task: enter')
try:
while True:
await asyncio.sleep(1)
logging.info('some_task: action')
finally:
logging.info('sub_task: cleanup')
await asyncio.sleep(3)
logging.info('sub_task: cleanup generates exception')
raise ValueError()
logging.info('sub_task: cleanup end')

task = asyncio.create_task(sub_task())
 
try:
while True:
await asyncio.sleep(1)
except Exception as e:
logging.info(f"Main: exception {repr(e)} received: something went 
wrong: cancelling sub-task")
task.cancel()
finally:
logging.info("Main: cleanup")
try:
await task
except Exception as e:
logging.info(f"Main: catched exception {repr(e)} from await 
sub_task")

try:
asyncio.run( main() )
except KeyboardInterrupt:
pass

-- Script Output with Ctrl+C manually generating a KeyboardInterrupt exception 
--


INFO:root:sub_task: enter
INFO:root:some_task: action
<--- CtrlC pressed here
INFO:root:Main: exception CancelledError() received: something went wrong: 
cancelling sub-task
INFO:root:Main: cleanup
INFO:root:sub_task: cleanup
INFO:root:sub_task: cleanup generates exception
INFO:root:Main: catched exception ValueError() from await sub_task
ERROR:asyncio:unhandled exception during asyncio.run() shutdown
task: .sub_task() done, defined at 
D:\Benutzer\projekte\iep\apps\data_player\_signals_test\test.py:10> 
exception=ValueError()>
Traceback (most recent call last):
  File 
"C:\Users\z0013xar\AppData\Local\Continuum\anaconda3\lib\asyncio\runners.py", 
line 43, in run
return loop.run_until_complete(main)
  File 
"C:\Users\z0013xar\AppData\Local\Continuum\anaconda3\lib\asyncio\base_events.py",
 line 574, in run_until_complete
self.run_forever()
  File 
"C:\Users\z0013xar\AppData\Local\Continuum\anaconda3\lib\asyncio\base_events.py",
 line 541, in run_forever
self._run_once()
  File 
"C:\Users\z0013xar\AppData\Local\Continuum\anaconda3\lib\asyncio\base_events.py",
 line 1750, in _run_once
event_list = self._selector.select(timeout)
  File "C:\Users\z0013xar\AppData\Local\Continuum\anaconda3\lib\selectors.py", 
line 323, in select
r, w, _ = self._select(self._readers, self._writers, [], timeout)
  File "C:\Users\z0013xar\AppData\Local\Continuum\anaconda3\lib\selectors.py", 
line 314, in _select
r, w, x = select.select(r, w, w, timeout)
KeyboardInterrupt

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "D:\Benutzer\projekte\iep\apps\data_player\_signals_test\test.py", line 
14, in sub_task
await asyncio.sleep(1)
  File 
"C:\Users\z0013xar\AppData\Local\Continuum\anaconda3\lib\asyncio\tasks.py", 
line 595, in sleep
return await future
concurrent.futures._base.CancelledError

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "D:\Benutzer\projekte\iep\apps\data_player\_signals_test\test.py", line 
34, in main
await task
  File "D:\Benutzer\projekte\iep\apps\data_player\_signals_test\test.py", line 
20, in sub_task
raise ValueError()
ValueError

-- Expected Output --

Same as above but without

  "ERROR:asyncio:unhandled exception during asyncio.run() shutdown"

and following traceback

--
components: asyncio
messages: 398638
nosy: andreash, asvetlov, ysel

[issue45184] Add `pop` function to remove context manager from (Async)ExitStack

2021-09-13 Thread Andreas H.


New submission from Andreas H. :

Currently it is not possible to remove context managers from an ExitStack (or 
AsyncExitStack). 


Workarounds are difficult and generally access implementation details of 
(Async)ExitStack. 
See e.g. https://stackoverflow.com/a/37607405. It could be done as follows:


class AsyncExitStackWithPop(contextlib.AsyncExitStack):
    """Same as AsyncExitStack but with pop, i.e. removal functionality"""
    async def pop(self, cm):
        callbacks = self._exit_callbacks
        self._exit_callbacks = collections.deque()
        found = None
        while callbacks:
            cb = callbacks.popleft()
            if cb[1].__self__ == cm:
                found = cb
            else:
                self._exit_callbacks.append(cb)
        if not found:
            raise KeyError("context manager not found")
        if found[0]:
            return found[1](None, None, None)
        else:
            return await found[1](None, None, None)
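
For the record, the same workaround can be sketched for the synchronous
ExitStack. ExitStackWithPop and Resource below are illustrative names, not
part of the proposal, and like the code above this relies on the private
_exit_callbacks deque, an implementation detail that may change between
Python versions:

```python
import collections
import contextlib

class ExitStackWithPop(contextlib.ExitStack):
    """Synchronous sketch of the proposed pop() (illustrative only)."""
    def pop(self, cm):
        callbacks = self._exit_callbacks
        self._exit_callbacks = collections.deque()
        found = None
        while callbacks:
            cb = callbacks.popleft()
            # Each entry is (is_sync, callback); enter_context() sets the
            # callback's __self__ to the context manager itself.
            if cb[1].__self__ is cm:
                found = cb
            else:
                self._exit_callbacks.append(cb)
        if found is None:
            raise KeyError("context manager not found")
        return found[1](None, None, None)

class Resource:
    """Toy context manager that records enter/exit events."""
    def __init__(self, log, name):
        self.log, self.name = log, name
    def __enter__(self):
        self.log.append(("enter", self.name))
        return self
    def __exit__(self, *exc):
        self.log.append(("exit", self.name))
        return False

log = []
stack = ExitStackWithPop()
a = stack.enter_context(Resource(log, "a"))
b = stack.enter_context(Resource(log, "b"))
stack.pop(a)   # release "a" early; "b" stays on the stack
stack.close()  # releases the remaining "b"
```

After pop(a) the log reads enter a, enter b, exit a, exit b: "a" was
released out of stack order while "b" kept its place.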

The alternative is re-implementation of ExitStack with pop functionality, but 
that is also very difficult to get right (especially with exceptions), which 
is probably the reason why there is an ExitStack in the library at all.


So I propose to augment (Async)ExitStack with a `pop` method like the one 
above, or similar.


Use-Cases:

An example is a component that manages several connections to network services. 
During run-time the set of network services might need to change (i.e. some be 
disconnected and some be connected according to business logic), or the 
component may need to handle re-connection events (i.e. graceful response to 
network errors). It is not too hard to imagine more use cases: essentially 
every case where dynamic resource management is needed and where single 
resources are manageable with Python context managers.

--
components: Library (Lib)
messages: 401703
nosy: andreash, ncoghlan, yselivanov
priority: normal
severity: normal
status: open
title: Add `pop` function to remove context manager from (Async)ExitStack
type: enhancement
versions: Python 3.10, Python 3.11, Python 3.6, Python 3.7, Python 3.8, Python 
3.9

___
Python tracker 
<https://bugs.python.org/issue45184>
___
___
Python-bugs-list mailing list
Unsubscribe: 
https://mail.python.org/mailman/options/python-bugs-list/archive%40mail-archive.com



[issue45184] Add `pop` function to remove context manager from (Async)ExitStack

2021-09-13 Thread Andreas H.


Andreas H.  added the comment:

I see your point. But even with `pop` or `remove` it is still a stack, or 
stack-like: in the normal case the context managers are still released in the 
reverse order of how they were added; order cannot be changed arbitrarily.

There is just the additional ability to remove a single context manager 
prematurely (e.g. for graceful error recovery and such). 

I would perhaps say that a stack is the "wrong" solution to the problem of 
"programmatically combining context managers" [this is from the official 
documentation] in the first place. I write wrong in quotes because it is of 
course not really wrong, as one wants the reverse exit order. But to adequately 
address the dynamic case one needs, in my opinion, the ability to prematurely 
remove context managers. Otherwise the use is limited.

Reimplementing the desired functionality with dicts or lists does not seem 
appealing to me, as the code would be 90% the same as ExitStack. It would then 
also make ExitStack obsolete. So why not integrate it there?

The asymmetry of being able to add context managers but not being able to 
remove them also seems odd to me.

--

___
Python tracker 
<https://bugs.python.org/issue45184>
___



[issue45462] Speed up re.match with pre-compiled patterns

2021-10-13 Thread Jonas H.


New submission from Jonas H. :

re.match(p, ...) with a pre-compiled pattern p = re.compile(...) can be much 
slower than calling p.match(...), probably mostly in cases with "easy" patterns 
and/or short strings.

The culprit is that re.match -> re._compile can spend a lot of time looking up 
p in its internal _cache, where it will never find p:

def _compile(pattern, flags):
    ...
    try:
        return _cache[type(pattern), pattern, flags]
    except KeyError:
        pass
    if isinstance(pattern, Pattern):
        ...
        return pattern
    ...
    _cache[type(pattern), pattern, flags] = p
    ...

_compile will always return before the _cache is set if given a Pattern object.

By simply reordering the isinstance(..., Pattern) check we can save a lot of 
time.

I've seen speedups in the range of 2x-5x on some of my data. As an example:

Raw speed of re.compile(p, ...).match():
time ./python.exe -c 'import re'\n'pat = re.compile(".").match'\n'for _ in 
range(1_000_000): pat("asdf")'
Executed in  190.59 millis

Speed with this optimization:
time ./python.exe -c 'import re'\n'pat = re.compile(".")'\n'for _ in 
range(1_000_000): re.match(pat, "asdf")'
Executed in  291.39 millis

Speed without this optimization:
time ./python.exe -c 'import re'\n'pat = re.compile(".")'\n'for _ in 
range(1_000_000): re.match(pat, "asdf")'
Executed in  554.42 millis

--
components: Regular Expressions
messages: 403851
nosy: ezio.melotti, jonash, mrabarnett
priority: normal
severity: normal
status: open
title: Speed up re.match with pre-compiled patterns
type: performance
versions: Python 3.11

___
Python tracker 
<https://bugs.python.org/issue45462>
___



[issue45462] Speed up re.match with pre-compiled patterns

2021-10-13 Thread Jonas H.


Change by Jonas H. :


--
keywords: +patch
pull_requests: +27224
stage:  -> patch review
pull_request: https://github.com/python/cpython/pull/28936

___
Python tracker 
<https://bugs.python.org/issue45462>
___



[issue45462] Speed up re.match with pre-compiled patterns

2021-10-15 Thread Jonas H.


Jonas H.  added the comment:

I agree with your statement in principle. Here are numbers for the slowdown 
that's introduced:

Without the change:
  ./python.exe -m timeit -s 'import re'\n'[re.compile(f"fill_cache{i}") for i 
in range(512)]'\n'pat = re.compile(".")' 're.match(pat, "asdf")'
  50 loops, best of 5: 462 nsec per loop
  ./python.exe -m timeit -s 'import re'\n'[re.compile(f"fill_cache{i}") for i 
in range(512)]'\n'pat = re.compile(".")' 're.match(".", "asdf")'
  100 loops, best of 5: 316 nsec per loop

With the change:
  ./python.exe -m timeit -s 'import re'\n'[re.compile(f"fill_cache{i}") for i 
in range(512)]'\n'pat = re.compile(".")' 're.match(pat, "asdf")'
100 loops, best of 5: 207 nsec per loop
  ./python.exe -m timeit -s 'import re'\n'[re.compile(f"fill_cache{i}") for i 
in range(512)]'\n'pat = re.compile(".")' 're.match(".", "asdf")'
100 loops, best of 5: 351 nsec per loop

So we have a 2x speedup in the uncommon case and a 10% slowdown in the common 
case.

--

___
Python tracker 
<https://bugs.python.org/issue45462>
___



[issue45462] Speed up re.match with pre-compiled patterns

2021-10-15 Thread Jonas H.


Jonas H.  added the comment:

pat.match() has 110 nsec.

Feel free to close the issue and PR if you think this isn't worth changing.

--

___
Python tracker 
<https://bugs.python.org/issue45462>
___



[issue44556] ctypes unittest crashes with libffi 3.4.2

2021-11-19 Thread h-vetinari


Change by h-vetinari :


--
nosy: +h-vetinari

___
Python tracker 
<https://bugs.python.org/issue44556>
___



[issue38779] Simple typo in strings module documentation

2019-11-12 Thread Michael H


New submission from Michael H :

https://docs.python.org/3/tutorial/introduction.html#strings

In the strings part of the basic tutorial, there is an output error regarding 
the escaping of the single quote

>>> '"Isn\'t," they said.'
'"Isn\'t," they said.' # I think the output should be correct

Thanks

--
assignee: docs@python
components: Documentation
messages: 356461
nosy: Michael H2, docs@python
priority: normal
severity: normal
status: open
title: Simple typo in strings module documentation
type: compile error
versions: Python 3.9

___
Python tracker 
<https://bugs.python.org/issue38779>
___



[issue38779] Simple typo in strings module documentation

2019-11-12 Thread Michael H


Michael H  added the comment:

Sorry, it's my bad, it is correct as it is; I hadn't read further on about the 
print statement being needed. As I am working through the tutorial in PyCharm, 
I had already used the print statement.
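
For the record, the behaviour in question is the repr-vs-print distinction the
tutorial goes on to explain: the interactive prompt echoes repr() of the
result (escapes included), while print() shows the readable form:

```python
s = '"Isn\'t," they said.'
echoed = repr(s)          # what the interactive prompt displays
assert "\\'" in echoed    # the escaped single quote is still there
assert eval(echoed) == s  # and the repr round-trips to the same string
print(s)                  # "Isn't," they said.  <- what print() shows
```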

Thanks!

--

___
Python tracker 
<https://bugs.python.org/issue38779>
___



[issue38779] Simple typo in strings module documentation

2019-11-12 Thread Michael H


Michael H  added the comment:

Many thanks!

--

___
Python tracker 
<https://bugs.python.org/issue38779>
___



[issue42380] Build windows binaries with MS VS2019 16.8+ / MSVC 19.28+

2021-01-29 Thread h-vetinari


h-vetinari  added the comment:

Hey Terry

I had asked about this on discuss 
(https://discuss.python.org/t/toolchain-upgrade-on-windows/6377/2), and Steve 
provided some very valuable input.

In particular, building with the newer VS (that supports C11) should stay 
ABI-compatible with everything that has been built on Visual Studio 2015, 2017 
and 2019:
> This is different from all previous Visual C++ versions, as they each had 
> their own distinct runtime files, not shared with other versions.

(from 
https://docs.microsoft.com/en-gb/cpp/windows/universal-crt-deployment?view=msvc-160&viewFallbackFrom=vs-2019),
 due to the way the (now-)universal runtime is designed.

Thanks

--

___
Python tracker 
<https://bugs.python.org/issue42380>
___



[issue42380] Build windows binaries with MS VS2019 16.8+ / MSVC 19.28+

2021-01-29 Thread h-vetinari


h-vetinari  added the comment:

PS.
> Judging from the link you posted to version numbering
https://en.wikipedia.org/wiki/Microsoft_Visual_C%2B%2B#Internal_version_numbering
 the first line should have 'MSVC 14.28' the middle column title should be 'MS 
Visual Studio'.

The wiki page was refactored quite extensively it seems, this is what I had 
been referring to: 
https://en.wikipedia.org/w/index.php?title=Microsoft_Visual_C%2B%2B&oldid=997067123.
 But I'll happily admit that I don't understand the reasons behind (the 
differences between) the various version numbers: MSVC++, _MSC_VER, etc.

--

___
Python tracker 
<https://bugs.python.org/issue42380>
___



[issue42380] Build windows binaries with MS VS2019 16.8+ / MSVC 19.28+

2021-01-29 Thread h-vetinari


h-vetinari  added the comment:

PPS. Also, the compiler implementation reference uses 19.x for MSVC: 
https://en.cppreference.com/w/cpp/compiler_support, which was the link I was 
trying to make, now that I'm looking at it.

--

___
Python tracker 
<https://bugs.python.org/issue42380>
___



[issue43112] SOABI on Linux does not distinguish between GNU libc and musl libc

2021-02-10 Thread h-vetinari


Change by h-vetinari :


--
nosy: +h-vetinari

___
Python tracker 
<https://bugs.python.org/issue43112>
___



[issue41030] Provide toList() method on iterators (`list()` is a flow killer in REPL)

2020-06-19 Thread Julien H


Change by Julien H :


--
components: +Library (Lib) -Demos and Tools
versions:  -Python 3.9

___
Python tracker 
<https://bugs.python.org/issue41030>
___



[issue41030] Provide toList() method on iterators/generators (`list()` is a flow killer in REPL)

2020-06-19 Thread Julien H


Change by Julien H :


--
title: Provide toList() method on iterators (`list()` is a flow killer in REPL) 
-> Provide toList() method on iterators/generators (`list()` is a flow killer 
in REPL)

___
Python tracker 
<https://bugs.python.org/issue41030>
___



[issue41030] Provide toList() method on iterators/generators (`list()` is a flow killer in REPL)

2020-06-19 Thread Julien H


Julien H  added the comment:

Hello Ammar Askar,

I agree `_` avoids the "up arrow" problem I mentioned in the REPL. 

I actually primarily use jupyter notebooks in my work.

Point 1. in my first message is the primary issue. Having to edit the line in 
two places to perform one action has driven me crazy over and over.

--

___
Python tracker 
<https://bugs.python.org/issue41030>
___



[issue41107] Running a generator in a map-like manner

2020-06-24 Thread Natsumi H.


New submission from Natsumi H. :

I suggest adding a function which behaves like map but does not return 
anything, for iterating over a generator purely for its side effects.

This is useful in cases where you need to run a function on every element in a 
list without unnecessarily creating a generator object like map would. 

I think given the existence of the map function that this should be added to 
Python.
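
Pending such a function, the usual stdlib idiom is collections.deque with
maxlen=0, which exhausts an iterator without storing any results. The name
consume_map below is hypothetical, not an existing API:

```python
import collections

def consume_map(func, iterable):
    # consume_map is a hypothetical helper, not an existing API.
    # deque(maxlen=0) exhausts the iterator at C speed while discarding
    # every result, so no list is ever built.
    collections.deque(map(func, iterable), maxlen=0)

seen = []
consume_map(seen.append, range(3))
print(seen)  # [0, 1, 2]
```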

--
components: Interpreter Core
messages: 372275
nosy: natsuwumi
priority: normal
severity: normal
status: open
title: Running a generator in a map-like manner
type: enhancement
versions: Python 3.10

___
Python tracker 
<https://bugs.python.org/issue41107>
___



[issue41107] Running a generator in a map-like manner

2020-06-24 Thread Natsumi H.


Natsumi H.  added the comment:

Exactly that was the plan!

--

___
Python tracker 
<https://bugs.python.org/issue41107>
___



[issue41107] Running a generator in a map-like manner

2020-06-24 Thread Natsumi H.


Natsumi H.  added the comment:

If it won't be added do you reckon creating a library to solve this issue would 
be appropriate?

--

___
Python tracker 
<https://bugs.python.org/issue41107>
___



[issue42356] Dict inline manipulations

2020-11-14 Thread Tomek H


New submission from Tomek H :

With Python3.9 there is a great feature for merging `dict`s:
{1: 'a'} | {2: 'b'} => {1: 'a', 2: 'b'}


It would be very handy to filter out a dict with a similar fashion (for example 
& operator with a list/tuple/frozenset of keys you want to get back):
{1: 'a', 2: 'b', 3: 'c'} & [1, 3, 4] == {1: 'a', 3: 'c'}
{1: 'a', 2: 'b', 3: 'c'} & {1, 3, 4} == {1: 'a', 3: 'c'}


Also, omitting specified keys (for example - operator with a 
list/tuple/frozenset of keys you want to suppress):
{1: 'a', 2: 'b', 3: 'c'} - [3, 4] == {1: 'a', 2: 'b'}
{1: 'a', 2: 'b', 3: 'c'} - {3, 4} == {1: 'a', 2: 'b'}
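
Pending such operators, the proposed semantics can be expressed today with
dict comprehensions:

```python
d = {1: 'a', 2: 'b', 3: 'c'}

# "&"-style selection: keep only the listed keys
kept = {k: v for k, v in d.items() if k in {1, 3, 4}}
print(kept)     # {1: 'a', 3: 'c'}

# "-"-style omission: drop the listed keys
dropped = {k: v for k, v in d.items() if k not in {3, 4}}
print(dropped)  # {1: 'a', 2: 'b'}
```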


Regards!

--
components: Interpreter Core
messages: 380972
nosy: tomek.hlawiczka
priority: normal
severity: normal
status: open
title: Dict inline manipulations
type: enhancement
versions: Python 3.10

___
Python tracker 
<https://bugs.python.org/issue42356>
___



[issue42380] Build windows binaries with MS VS2019 16.8+ / MSVC 19.28+

2020-11-16 Thread h-vetinari


New submission from h-vetinari :

While Visual Studio 16.8 (<-> MSVC 19.28) has _just_ been released, I think it 
would be worthwhile to consider upgrading the compiler toolchain that's used to 
build the CPython windows binaries, particularly before the release of 3.10.

That's because many libraries (e.g. numpy/scipy) are stuck with the same 
compilers as CPython for ABI-compatibility, and generally, MSVC is by far the 
lowest common denominator in terms of C/C++ compliance, cf. 
https://github.com/scipy/scipy/blob/master/doc/source/toolchain.rst

For example, dropping python 3.6 support in scipy should finally enable them to 
use C++14/C++17, since python 3.7+ is built with Visual Studio 15.7, which has 
essentially complete support, cf. 
https://en.cppreference.com/w/cpp/compiler_support & 
https://en.wikipedia.org/wiki/Microsoft_Visual_C%2B%2B#Internal_version_numbering.

However (& as far as I can tell), the windows compiler version for CPython 
hasn't moved since the release of 3.7, cf. 
https://pythondev.readthedocs.io/windows.html#python-and-visual-studio-version-matrix
 (I know that's not an official page, but vstinner can hardly be considered a 
questionable source), and every release without upgrading the toolchain means 
another year of waiting for the ecosystem to unlock more modern C/C++.

The reason why Visual Studio 16.8 is particularly interesting, is that MS has 
for a very long time not paid attention to C compliance, and only recently 
completed C99 support, with C11/C17 following in 16.8 (though as of yet without 
optional aspects of the standard like atomics, threading, VLAs, complex types, 
etc.), cf. 
https://devblogs.microsoft.com/cppblog/c11-and-c17-standard-support-arriving-in-msvc/.

Looking at the table from 
https://github.com/scipy/scipy/blob/master/doc/source/toolchain.rst, it would 
be cool if we could add the last line as follows
==================   =============   =================
CPython              MS Visual C++   C Standard
==================   =============   =================
2.7, 3.0, 3.1, 3.2   9.0             C90
3.3, 3.4             10.0            C90 & some of C99
3.5, 3.6             14.0            C90 & most of C99
3.7                  15.7            C90 & most of C99
3.8                  15.7            C90 & most of C99
3.9                  15.7            C90 & most of C99
3.10                 16.8            C99, C11*, C17
==================   =============   =================
* [comment about lack of C11 optionals]

--
components: Windows
messages: 381167
nosy: h-vetinari, paul.moore, steve.dower, tim.golden, zach.ware
priority: normal
severity: normal
status: open
title: Build windows binaries with MS VS2019 16.8+ / MSVC 19.28+
type: enhancement
versions: Python 3.10

___
Python tracker 
<https://bugs.python.org/issue42380>
___



[issue14489] repr() function link on the built-in function documentation is incorrect

2012-04-03 Thread H Xu

New submission from H Xu :

The `repr()` built-in function link on this page [ 
http://docs.python.org/library/functions.html ] should link to the built-in 
version of `repr()`.
It should link to: http://docs.python.org/library/functions.html#repr

However, it links to here: http://docs.python.org/library/repr.html#module-repr

--
assignee: docs@python
components: Documentation
messages: 157462
nosy: H.Xu, docs@python
priority: normal
severity: normal
status: open
title: repr() function link on the built-in function documentation is incorrect
versions: Python 2.7

___
Python tracker 
<http://bugs.python.org/issue14489>
___



[issue1615158] POSIX capabilities support

2017-06-14 Thread Christian H

Changes by Christian H :


--
nosy: +Christian H

___
Python tracker 
<http://bugs.python.org/issue1615158>
___



[issue22536] subprocess should include filename in FileNotFoundError exception

2017-08-23 Thread Christian H

Christian H added the comment:

I was also bitten by this bug, and would like to see it merged. The patch 
22536-subprocess-exception-filename-2.patch looks fine to me.

--
nosy: +Christian H

___
Python tracker 
<http://bugs.python.org/issue22536>
___



[issue14156] argparse.FileType for '-' doesn't work for a mode of 'rb'

2017-05-26 Thread Marcel H

Marcel H added the comment:

I want to see this fixed in Python 3.x as well, please :) The patch should be 
the same.

--
nosy: +Marcel H2
versions: +Python 3.6, Python 3.7

___
Python tracker 
<http://bugs.python.org/issue14156>
___



[issue32679] concurrent.futures should store full sys.exc_info()

2018-01-26 Thread Jonas H.

New submission from Jonas H. :

Use case: Try to get a future's result using 
concurrent.futures.Future.result(), and log the full exception if there was any.

Currently, only "excinst" (sys.exc_info()[1]) is provided with the 
Future.exception() method.

Proposal: Add new Future.exc_info() method that returns the full sys.exc_info() 
at the time of the exception.
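
For context, in Python 3 the traceback from the worker survives on the
exception object itself, so a sys.exc_info()-style triple can already be
reconstructed from Future.exception(); a sketch:

```python
import concurrent.futures
import traceback

def failing():
    raise ValueError("boom")

with concurrent.futures.ThreadPoolExecutor(max_workers=1) as ex:
    fut = ex.submit(failing)
    exc = fut.exception()  # blocks until the future finishes

# The worker's traceback is attached to the exception object, so the
# full (type, value, traceback) triple can be rebuilt after the fact:
exc_info = (type(exc), exc, exc.__traceback__)
tb_text = "".join(traceback.format_exception(*exc_info))
```

The formatted text includes the original raise site, which is what loggers
usually need.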

--
components: Library (Lib)
messages: 310762
nosy: jonash
priority: normal
severity: normal
status: open
title: concurrent.futures should store full sys.exc_info()
type: enhancement
versions: Python 3.4, Python 3.5, Python 3.6, Python 3.7, Python 3.8

___
Python tracker 
<https://bugs.python.org/issue32679>
___



[issue32679] concurrent.futures should store full sys.exc_info()

2018-01-26 Thread Jonas H.

Jonas H.  added the comment:

See also 
https://stackoverflow.com/questions/19309514/getting-original-line-number-for-exception-in-concurrent-futures
 for other people having the same problem

--

___
Python tracker 
<https://bugs.python.org/issue32679>
___



[issue31526] Allow setting timestamp in gzip-compressed tarfiles

2017-11-08 Thread Jonas H.

Jonas H.  added the comment:

This affects me too.

--
nosy: +jonash

___
Python tracker 
<https://bugs.python.org/issue31526>
___



[issue32071] Add py.test-like "-k" test selection to unittest

2017-11-18 Thread Jonas H.

New submission from Jonas H. :

I'd like to add test selection based on parts of the test class/method name to 
unittest. Similar to py.test's "-k" option: 
https://docs.pytest.org/en/latest/example/markers.html#using-k-expr-to-select-tests-based-on-their-name

Here's a proof of concept implementation: 
https://github.com/jonashaag/cpython/compare/master...unittest-select

Is this something others find useful as well? If so, I'd like to work on 
getting this into Python stdlib proper. This is my first time contributing to 
the unittest framework; is the general approach taken in my PoC implementation 
correct in terms of abstractions? How can I improve the implementation?

Jonas

--
components: Library (Lib)
messages: 306490
nosy: jonash
priority: normal
severity: normal
status: open
title: Add py.test-like "-k" test selection to unittest
type: enhancement

___
Python tracker 
<https://bugs.python.org/issue32071>
___



[issue32071] Add py.test-like "-k" test selection to unittest

2017-11-18 Thread Jonas H.

Jonas H.  added the comment:

Just to be clear, the current implementation is limited to substring matches. 
It doesn't support py.test like "and/or" combinators. (Actually, py.test uses 
'eval' to support arbitrary patterns.)

So say we have test case

SomeClass
test_foo
test_bar

Then

- python -m unittest -k fo matches "test_foo"
- python -m unittest -k Some matches "test_foo" and "test_bar"
- python -m unittest -k some matches nothing

The -k option may be used multiple times, combining the patterns with "or":

- python -m unittest -k fo -k b matches "test_foo" and "test_bar"

It's also possible to use glob-style patterns, like -k "spam_*_eggs".
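
The matching rules described above can be sketched as follows (name_matches
is illustrative, not the actual TestLoader code): a pattern containing glob
characters is matched case-sensitively with fnmatch, anything else as a
plain substring, and multiple patterns combine with "or".

```python
import fnmatch

def name_matches(test_id, patterns):
    # Illustrative sketch of the -k matching rules, not the real loader.
    for pat in patterns:
        if any(ch in pat for ch in "*?["):
            # Glob pattern: wrap in '*' so it may match anywhere in the id.
            if fnmatch.fnmatchcase(test_id, "*%s*" % pat):
                return True
        elif pat in test_id:
            # Plain pattern: case-sensitive substring match.
            return True
    return False
```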

--

___
Python tracker 
<https://bugs.python.org/issue32071>
___



[issue32071] Add py.test-like "-k" test selection to unittest

2017-11-20 Thread Jonas H.

Jonas H.  added the comment:

Thanks Antoine. I will need some guidance as to the correct places to make 
these changes; I'm not quite sure about the abstractions here (runner, 
loader, suite, case, etc.).

My PoC (see GitHub link in first post) uses a TestSuite subclass. (The subclass 
is only so that it's easier to assess the general implementation approach; I 
guess it should be put into the main class instead.)

Things I'm unsure of:

1) Is suite the correct place for this kind of feature?
2) Is the hardcoded fnmatch-based pattern matcher ok, or do we need a new 
abstraction "NameMatcher"?
3) Is the approach of dynamically wrapping 'skip()' around to-be-skipped test 
cases OK?
4) The try...catch statement around 'test.id()' is needed because there are 
some unit tests (unit tests for the unittest module itself) that check for some 
error cases/error handling in the unittest framework, and crash if we try to 
call '.id()' on them. Please remove the try...catch to see these errors if 
you're interested in the details. Is the check OK like that, or is this a code 
smell?

Thanks
Jonas

--

___
Python tracker 
<https://bugs.python.org/issue32071>
___



[issue32071] Add py.test-like "-k" test selection to unittest

2017-11-20 Thread Jonas H.

Jonas H.  added the comment:

> > 3) Is the approach of dynamically wrapping 'skip()' around to-be-skipped 
> > test cases OK?

> I think this is the wrong approach.  A test that isn't selected shouldn't be 
> skipped, it should not appear in the output at all.  Another reason for 
> putting this in TestLoader :-)

My first implementation actually was mostly the test loader. Two things made me 
change my mind and try to make the changes in the suite code:

- The loader code really only deals with loading (i.e., finding + importing) 
tests. Yes it expects a file pattern like "test*.py" for identifying test case 
files. But apart from that it didn't "feel" right to put name based selection 
there.
- In py.test you'll get a console output like "5 tests passed, 1 test failed, 
12 tests deselected". We can't get anything similar without making bigger 
changes to the test loader, runner, etc. code. Using skip() we at least have 
some info on "skipped" tests, although they're technically not skipped.

Are you still saying this should go to the test loader?

--

___
Python tracker 
<https://bugs.python.org/issue32071>
___



[issue32071] Add py.test-like "-k" test selection to unittest

2017-11-21 Thread Jonas H.

Jonas H.  added the comment:

Interesting, Victor. I've had a look at the code you mentioned, but I'm afraid 
it doesn't really make sense to re-use any of the code.

Here's a new patch, implemented in the loader as suggested by Antoine, and with 
tests.

I'm happy to write documentation etc. once we're through with code review.

https://github.com/python/cpython/pull/4496

--
keywords: +patch
pull_requests: +4433
stage: needs patch -> patch review

___
Python tracker 
<https://bugs.python.org/issue32071>
___



[issue32071] Add py.test-like "-k" test selection to unittest

2017-11-27 Thread Jonas H.

Jonas H.  added the comment:

Sure!

--

___
Python tracker 
<https://bugs.python.org/issue32071>
___



[issue32071] Add py.test-like "-k" test selection to unittest

2017-11-27 Thread Jonas H.

Jonas H.  added the comment:

Ah, the problem isn't that it's running getattr() on test methods, but that it 
runs getattr() on all methods.

Former code:

    attrname.startswith(prefix) and \
        callable(getattr(testCaseClass, attrname))

New code:

    testFunc = getattr(testCaseClass, attrname)
    isTestMethod = attrname.startswith(self.testMethodPrefix) and \
        callable(testFunc)

This is trivial to fix. @Core devs: Should I revert to original behaviour with 
the order of the prefix check and the getattr() call, and add a regression test 
that guarantees this behaviour?

--

___
Python tracker 
<https://bugs.python.org/issue32071>
___



  1   2   >