Re: Mental model of lookahead assertions
On 2020-02-27, Stefan Ram wrote:

> One can count overlapping occurrences as follows.
>
> |>>> print(len(findall('(?=aa)','aaaa')))
> |3
>
> Every web page says that lookahead assertions do
> not consume nor move the "current position".
>
> But what mental model can I make of the regex
> engine that explains why it is not caught in an
> endless loop matching "aa" at the same position
> again and again and never advancing to the other
> occurrences?

Simply that when it's returning multiple matches it knows, once it's found a match at position n, to start looking for the next potential match at position n+1?
--
https://mail.python.org/mailman/listinfo/python-list
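A short demonstration (mine, not from the thread) of the behaviour being discussed: the lookahead match itself is zero-width, but re.findall advances the search position past each (empty) match before searching again, so it never re-tests the same position.

```python
# Counting overlapping vs. non-overlapping occurrences of "aa".
from re import findall

overlapping = findall('(?=aa)', 'aaaa')   # zero-width matches at positions 0, 1, 2
plain = findall('aa', 'aaaa')             # consuming matches at positions 0 and 2

print(len(overlapping))  # 3
print(len(plain))        # 2
```

The lookahead version finds one more match because the match at position 1 is not "consumed away" by the match at position 0.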
wxFormBuilder
The videos on YouTube by fandangleproductions have been of some use. However, as I also use the Geany editor, I find that I need to append this extra code to the Python script generated by wxFormBuilder:

# - extra code ---
class MyApp(wx.App):
    def OnInit(self):
        self.frame = frameMain(None)
        self.SetTopWindow(self.frame)
        self.frame.Show()
        return True
# end of class MyApp

if __name__ == "__main__":
    app = MyApp(0)
    app.MainLoop()

where frameMain is the name of this particular frame. This may help you with your Python coding.
Friday Finking: Poly more thick
How does one code a function/method signature so that it will accept either a set of key-value pairs, or the same data enclosed as a dict, as part of a general-case and polymorphic solution?

Wikipedia: polymorphism is the provision of a single interface to entities of different types.
( https://en.wikipedia.org/wiki/Polymorphism_(computer_science) )

At the end of a code-sprint, one of 'my' teams was presenting. Everything went well. Fatefully, the client-manager then remembered that the latest work would extend a previous sprint's work. Despite it not being a deliverable (?), he asked to be shown the two working together. The team, flushed with pride at their 'success', (foolishly) agreed, and the attempt immediately crashed (as such live-demos are wont to do!).

Wisely, a 'coffee/tea break' was suggested, to give the team time to 'fix' things. Hence a panic-call to me, working from home. It turned out that a new-ish and less-experienced coder had been given this job, and she had been shown the interface as key-value pairs (her interpretation) - and even on the white-board there were no braces to show it as a dict! Whereas the more experienced programmer who originally assembled the class/signature/interface had used a dict. Unhappiness was evident, argument threatened...

I quickly mocked-up the situation (code below). The fastest solution, which didn't require altering any calling-code, seemed to be to change the class's signature to use *args and **kwargs, which thereafter required only a single 'normalisation' step to turn any/the kwargs into a dict - which in turn enabled the existing code, unchanged. Quick and dirty to be sure! However, it allowed the demo to go ahead and recovered everyone's feelings of satisfaction/success!

Aside from 'repairing' team spirit, (as regular readers will recognise) I wasn't going to let this pass without some thought and discussion (and perhaps I might learn something about Python interfacing)!
Python's powerful polymorphic capabilities [insert expression of thanks here!] allow us to substitute collections - with care. So, a simple function, such as:

def f( colln ):
    for element in colln:
        print( element )

will work quite happily when (separately) fed either a tuple or a list. It will also work with a string (a collection of characters) - as long as we are happy with single characters being output.

What about dicts as collections? It will work with dicts, as long as we consider the dict's keys to be 'the dict' (cf its collection of values). Well, that's very flexible!

However, whilst a set of key-value arguments 'quack' somewhat like a dict, or even a tuple (of k-v pairs), they will not be accepted as parameters. Crash! Yes, it can be solved with a 'normalisation' function, as described above, perhaps as a working-theory:

def f( *colln_as_tuple, **key_value_pairs_as_dict ):
    my_dict = key_value_pairs_as_dict if key_value_pairs_as_dict \
              else colln_as_tuple[ 0 ]
    # no change to existing code using my_dict
    ...

However, this seems like a situation that "quacks like a duck", yet the built-in polymorphism doesn't auto-magically extend to cover it.

So:
- is that the way 'polymorphism' really works/should work?
- is expecting the dict to work as a "collection" already 'taking things too far'?
- is expecting the key-values to work really 'taking things too far'?

Remember that I am not an OOP-native! The discussion around-the-office diverged into two topics: (1) theoretical: the limits and capability of polymorphism, and (2) practical: the understanding and limits of a Python "collection".

How do you see it? How would you solve the Python coding problem - without re-writing both sprints-worth of code? Did we leave polymorphism 'behind' and expect Python to perform magic?
### investigative and prototyping code ###

def print_dict( dictionary ):
    # helper function only
    for key, value in dictionary.items():
        print( "\t", key, value )

def f( **kwargs ):
    print( kwargs, type( kwargs ) )
    print_dict( kwargs )

f( a=1, b=2 )    # arguments as key-value pairs
### {'a': 1, 'b': 2} <class 'dict'>
###      a 1
###      b 2

d = { 'a':1, 'b':2 }    # arguments as a dictionary
try:
    f( d )
except TypeError:
    print( "N/A" )
### N/A
### Traceback (most recent call last):
###   File "<stdin>", line 1, in <module>
### TypeError: f() takes 0 positional arguments but 1 was given

f( **d )    # yes, easy to edit arguments, but...
### {'a': 1, 'b': 2} <class 'dict'>
###      a 1
###      b 2

try:
    f( { 'a':1, 'b':2 } )    # or modify API to accept single dict
except TypeError:
    print( "N/A" )
### N/A

print( "\nwith g()\n" )
### with g()

def g( *args, **kwargs ):
    print( args, type( args ), kwargs, type( kwargs ) )
    for arg in args:
        print( arg )
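For what it's worth, the built-in dict constructor already performs much of the 'normalisation' described above: it accepts a mapping, an iterable of key-value pairs, or keyword arguments. A minimal sketch (my own, not the sprint code) of a signature that takes all three call styles:

```python
# dict() itself is polymorphic over a mapping, an iterable of pairs,
# or keyword arguments - so one line can normalise every call style.
def f( *args, **kwargs ):
    my_dict = dict( *args, **kwargs )
    # ... existing code using my_dict, unchanged ...
    return my_dict

print( f( a=1, b=2 ) )                # key-value pairs
print( f( { 'a':1, 'b':2 } ) )        # a dict
print( f( [ ('a',1), ('b',2) ] ) )    # a sequence of k-v pairs
```

All three calls produce the same {'a': 1, 'b': 2}. Note that dict() accepts at most one positional argument, so a caller passing several positional collections would still (correctly) get a TypeError.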
Regarding Performance of a Python Script...
Hello there, I have a question regarding a simple code snippet in Python:

from subprocess import check_output

for i in range(1024):
    check_output(['/bin/bash', '-c', 'echo 42'], close_fds=True)

I wonder why running it in Python 3.7 is much faster than in Python 2.7. (Python 3.7 is still faster after I used xrange in Python 2.7.)

Thanks all!
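One way to put numbers on the difference (a hypothetical measurement harness, not from the original post) is to time the loop directly; sys.executable is used here instead of /bin/bash so the snippet runs anywhere Python does:

```python
# Hypothetical timing harness: measure per-call subprocess overhead.
import sys
import time
from subprocess import check_output

N = 20  # far fewer iterations than the original 1024, for a quick check
start = time.perf_counter()
for i in range(N):
    out = check_output([sys.executable, '-c', 'print(42)'], close_fds=True)
elapsed = time.perf_counter() - start

print(out)                                  # b'42\n' (b'42\r\n' on Windows)
print(f"{elapsed / N * 1000:.1f} ms per call")
```

Running the same harness under both interpreters isolates the subprocess-spawning cost from everything else in the script.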
Re: Data model and attribute resolution in subclasses
On 2/28/2020 2:21 AM, Adam Preble wrote:

> I have been making some progress on my custom interpreter project but I found I have totally blown implementing proper subclassing in the data model. What I have right now is PyClass defining what a PyObject is. When I make a PyObject from a PyClass, the PyObject sets up a __dict__ that is used for attribute lookup. When I realized I needed to worry about looking up parent namespace stuff, this fell apart because my PyClass had no real notion of a namespace.
>
> I'm looking at the Python data model for inspiration. While I don't have to implement the full specifications, it helps me where I don't have an alternative. However, the data model is definitely a programmer document; it's one of those things where the prose is being very precise in what it's saying and that can foil a casual reading.
>
> Here's what I think is supposed to exist:
> 1. PyObject is the base.
> 2. It has an "internal dictionary." This isn't exposed as __dict__

The internal mapping *is* visible:

>>> object.__dict__
mappingproxy({'__repr__': <slot wrapper '__repr__' of 'object' objects>, '__hash__': <slot wrapper '__hash__' of 'object' objects>, '__str__': <slot wrapper '__str__' of 'object' objects>, '__getattribute__': <slot wrapper '__getattribute__' of 'object' objects>, '__setattr__': <slot wrapper '__setattr__' of 'object' objects>, ..., '__doc__': 'The base class of the class hierarchy.\n\nWhen called, it accepts no arguments and returns a new featureless\ninstance that has no instance attributes and cannot be given any.\n'})

The internal mapping is not an instance of dict, and need/should not be, as it is frozen.
>>> o = object()
>>> o.a = 3
Traceback (most recent call last):
  File "<stdin>", line 1, in <module>
    o.a = 3
AttributeError: 'object' object has no attribute 'a'

When classes and objects do have a dict __dict__, __dict__ is not in __dict__, to avoid recursion. getattr or __getattribute__ must special-case '__dict__'. I am guessing this is what must be from the evidence, without seeing the actual code.

> 3. PyClass subclasses PyObject.
> 4. PyClass has a __dict__
>
> Is there a term for PyObject's internal dictionary? It wasn't called __dict__ and I think that's for good reasons. I guess the idea is a PyObject doesn't have a namespace, but a PyClass does (?).
>
> Now to look something up. I assume that __getattribute__ is supposed to do something like:
> 1. The PyClass __dict__ for the given PyObject is consulted.
> 2. The implementation for __getattribute__ for the PyObject will default to looking into the "internal dictionary."
> 3. Assuming the attribute is not found, the subclasses are then consulted using the subclass' __getattribute__ calls. We might recurse on this. There's probably some trivia here regarding multiple inheritance; I'm not entirely concerned (yet).

For non-reserved names, attribute lookup starts with the object dict, then the object's class, then superclasses. Dunder names usually start with the object's class.

> 4. Assuming it's never found, then the user sees an AttributeError. Would each of these failed lookups result in an AttributeError?

I presume so. They are caught and re-raised only if there is nowhere else to look.

-- 
Terry Jan Reedy
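The lookup order described in that reply can be observed directly; a small sketch (class names here are hypothetical, not from the thread):

```python
# Instance __dict__ is consulted first, then the object's class,
# then the superclasses, in method-resolution order.
class Base:
    attr = "from Base"

class Child(Base):
    attr = "from Child"

obj = Child()
print(obj.attr)              # "from Child" - found on the class
obj.attr = "from instance"
print(obj.attr)              # "from instance" - instance __dict__ wins
del obj.attr
print(obj.attr)              # "from Child" again
del Child.attr
print(obj.attr)              # "from Base" - superclass consulted last
```

Only when every dict in the chain has been exhausted does the user finally see an AttributeError.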
Help building python application from source
Sorry for re-posting; I forgot the subject line in my last email. I am a Python noob, which is why I am asking the Python masters. There is a Python program I want to install on the server called ElectrumX; https://github.com/kyuupichan/electrumx is the link. I am having trouble installing this.

The short version is: I want to build this Python application and its needed dependencies from source code, all from a local directory, without relying on the Python pip package servers. I only run software I can compile from source code, because this is the only way to trust open-source software. I also want to archive the software I use and be able to install it on systems in a grid-down situation, without relying on other servers such as the Python package servers.

Here is a snippet from https://github.com/lee-chiffre/Announcements/blob/master/02.26.2020 that describes what I am trying to do:

> 1. I need a way I can download the source of ElectrumX and the dependency tree.
> 2. In a way that also verifies the integrity of those downloads. I don't think many of these Python packages are even signed.
> 3. Then to be able to build ElectrumX and the needed dependencies from source on a computer that does not have a connection to the internet. Node2.0 does not connect to the internet except for the Tor process. Torsocks is not an option here. I will be running ElectrumX and the Python dependencies in a venv virtual environment.
>
> If I cannot build ElectrumX and its dependencies from source in a way that also verifies the integrity of the downloads, then for security reasons I will not run it. If someone has a solution to this, I will then run ElectrumX and give you credit for the help. If you need me to pay a bounty for this, please reach out to me to negotiate a price.

Python might be easy to code, but that simplicity of coding comes at the cost of complexity in the software. With C++ I usually have only a few dependencies. With Python it seems like it is almost 20 dependencies.
With ElectrumX I counted at least 15 dependencies in the dependency tree. Is it possible to download and then build this from source with all the needed dependencies?

-- 
lee.chif...@secmail.pro
PGP 97F0C3AE985A191DA0556BCAA82529E2025BDE35
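For the record, a sketch of one standard approach using only documented pip options (I have not verified this specifically against ElectrumX's dependency tree): pip can fetch a package plus its whole dependency tree into a local directory on a connected machine, and later install from that directory with the network disabled.

```shell
# On a connected machine: fetch ElectrumX and its dependency tree as
# source distributions into a local directory for archiving.
pip download --no-binary :all: --dest ./electrumx-deps electrumx

# Later, on the air-gapped machine, inside a venv:
pip install --no-index --find-links ./electrumx-deps electrumx

# For the integrity concern: pin exact versions plus SHA-256 hashes in a
# requirements file, then install with
#   pip install --require-hashes -r requirements.txt
# so any artifact whose hash does not match is rejected.
```

This covers download, offline build/install, and hash verification; it does not cover author signatures, which (as you note) most PyPI packages do not provide.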
Re: Asyncio question (rmlibre)
On 2020-02-28 1:37 AM, rmli...@riseup.net wrote:

> What resources are you trying to conserve?
>
> If you want to try conserving time, you shouldn't have to worry about
> starting too many background tasks. That's because asyncio code was
> designed to be extremely time efficient at handling large numbers of
> concurrent async tasks.

Thanks for the reply. That is exactly what I want, and in an earlier response Greg echoes what you say here - background tasks are lightweight and are ideal for my situation.

Frank
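For anyone finding this thread later, a minimal sketch of the pattern under discussion (names are mine, not from the thread): the event loop multiplexes many tasks cooperatively, so launching hundreds of them is cheap compared to threads or processes.

```python
# Minimal sketch: many lightweight background tasks on one event loop.
import asyncio

async def background_job(n):
    await asyncio.sleep(0)   # stand-in for real awaitable work (I/O etc.)
    return n

async def main():
    # create_task() schedules each coroutine immediately in the background
    tasks = [asyncio.create_task(background_job(i)) for i in range(1000)]
    return await asyncio.gather(*tasks)

results = asyncio.run(main())
print(len(results))          # 1000
```

All one thousand tasks complete in a single pass through the loop; no threads are spawned.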