Antoon, just to be clear: I am talking about a different measure of efficiency.

If you have code that handles a limited functionality properly, it can be quite simple. If you then expand the code to handle ever more situations, it may do things like run through a series of IF statements to determine which of many things to do, so it now takes a while just to reach the part that handles the most common case.

I have seen people use code that knows what arguments to expect and just does things simply, like adding two integers together. Then the code is improved so it detects if either argument is floating point and does any needed conversions. Soon you add support for complex numbers, or character strings that look like "23", and then even parse "twelve dollars and forty two cents" in English, and then a few other languages. Next you accept range objects and other things that make sense to add, accept an undefined number of arguments, name the darn thing sum_everything(), and proudly show it off. It now has an amazing number of errors it can propagate or warn about. But, you can still use it to add 2 and 2.
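To make that concrete, here is a minimal sketch of the progression. The function and its dispatch rules are hypothetical, invented for this message, not any real library:

    def add(a, b):
        # The original: one operation, no decisions to make.
        return a + b

    def sum_everything(*args):
        # The "improved" version: every call pays for dispatch,
        # even the 98% case of adding two small integers.
        total = 0
        for arg in args:
            if isinstance(arg, (int, float, complex)):
                total += arg
            elif isinstance(arg, str):
                # Handles "23"; parsing "twelve dollars and forty
                # two cents" would add yet another branch here.
                total += float(arg)
            elif isinstance(arg, range):
                total += sum(arg)
            else:
                raise TypeError(f"cannot sum a {type(arg).__name__}")
        return total

Both still add 2 and 2, but the second one inspects the argument count and every argument's type before it ever gets around to the addition.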
Now, yes, nobody needs a function to just add two numbers. If that bothers you, make it add the absolute values or something a tad more interesting. But the point is that any code that invokes sum_everything() may now pay a penalty in terms of performance just in the beginning part, where it tests how many arguments it got, what types they are, and so on.

The topic here is the Python run-time parser, though. It is reading your code and doing whatever complex set of things it has to do to parse from a fairly large set of possible valid programs, as well as invalid ones. I have never looked deeply at how it works, but my guess is that somewhere in there are concepts like: a simple_assignment_expression can look like THIS; a complex_assignment_expression can look like a simple_assignment_expression OR THAT OR ... So to parse code you often need to look at alternate ways of connecting symbols and hopefully find the one and only way it has to be looked at. Parentheses, as an example, have many possible meanings, and you may not know which meaning applies when you encounter one until you keep going and see where there may be a matching one (while ignoring any within a character string). I won't go on, but the point is that the parser can jump through more hoops, even in the most usual cases, when it has to look for new cases not originally designed in.

Your argument that people will use other techniques to get the functionality they want is not really relevant, as I do not deny it. My point is that the most common ways NORMALLY used are the ones that drive the efficiency of a system. So if adding functionality to include new cases/situations doubles the time it takes to do the most common case, and that case is used 98% of the time, then how much overall gain for the other 2% is needed to counterbalance it?

I find a common pattern in software that often looks like extra layers around a function call. There may be a function to create an object given a character string argument, like vector("integer", 1, 2, 3) or vector("character", "a", "b"), that lets you create all kinds of vectors. Someone comes along with a bright idea to make programmers instead call make_integer(1, 2, 3) and make_character("a", "b") and more like that. We now have lots of new functions that are just going to turn around and call vector() with the appropriate string as the first argument and pass along the rest. We now have a function calling a second function. Yes, there are many possible advantages here, including ways to check if you are using your code as intended. But there is overhead. And in a tight loop repeated millions of times, can you blame a programmer who knows this, if they just call vector() directly, or perhaps even the deeper function that vector() itself calls when it knows it is working with integers?
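As a sketch of that layering (vector() and the wrappers here are the made-up names from my example, not a real API):

    def vector(kind, *values):
        # The general constructor: dispatches on the kind string.
        if kind == "integer":
            return [int(v) for v in values]
        if kind == "character":
            return [str(v) for v in values]
        raise ValueError(f"unknown kind: {kind}")

    def make_integer(*values):
        # The "friendlier" layer: one extra call frame and one extra
        # argument repack on every single invocation.
        return vector("integer", *values)

In a loop run millions of times, that extra frame is exactly the overhead a programmer may choose to skip by calling vector() directly.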
I will end with this. If someone wants to design a new language from scratch, with a goal of starting with as general a set of concepts as they can, fine. Design it carefully. Build it, and if it works well enough, use it. But to ask an existing language to add features or expand existing ones is not at all the same thing, and it requires much more care.

In Python, you can find areas that are a bit confusing, such as how multiple inheritance in objects is done. It can require some tweaking to build your objects in such a way that the right thing is inherited from the other objects the way you want, if more than one has the same method, and you can have subtle errors. Arguably the darn thing is too general, and many other languages instead decide not to support multiple inheritance and use other interesting ways to get similar functionality. But although this can be a very nice feature, allowing you to design quite sophisticated sets of objects that inherit all kinds of nifty powers from other existing objects, it can be a drag on performance if it has to search through a big mess to find the right function to call at run time! Sometimes it may be easier to not use multiple inheritance in some part of your code and use a work-around to get what you want.
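To illustrate the kind of subtlety I mean, here is a tiny contrived diamond, not from any real code:

    class A:
        def greet(self):
            return "A"

    class B(A):
        def greet(self):
            return "B"

    class C(A):
        def greet(self):
            return "C"

    class D(B, C):
        # No greet() of its own: which parent wins is decided by the
        # method resolution order Python computes for you.
        pass

    print(D().greet())   # prints "B", because B precedes C in the MRO
    print(D.__mro__)     # (D, B, C, A, object)

Declare it as class D(C, B) instead and the same call silently returns "C". That is the sort of subtle error, and the run-time method lookup is the sort of cost, I am talking about.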
I am not against extending Python in the direction someone wants. I am FOR careful examination and study before making the change, and for weighing whether it is likely to be more useful than other things being asked for, against the relative costs. Many things turn out not to be needed. I recall programs designed to use every letter of the alphabet (and other symbols) whether needed or not. I mean things like "d" for delete and "a" for add and "t" for transpose, and then, just to be complete, making up something for "q" or "z" to do. Why? Not because anyone wants or needs those. I have seen projects like that take longer to create and be harder to test, and the users mostly thought the result was too complex and rarely or never used some of the functionality.

I have made macros in an emacs editor like transpose-letter and transpose-word, and continued with sentences, paragraphs, chapters, and something beyond, like transpose-on-paste-buffers. But most people actually just do a cut, move, and paste operation for the more complex scenarios, even if they remember the fancy version exists and is bound to some forgotten series of keys clicked together, like control-X control-alt-t or something.

-----Original Message-----
From: Python-list <python-list-bounces+avigross=verizon....@python.org> On Behalf Of Antoon Pardon
Sent: Monday, October 25, 2021 5:21 AM
To: python-list@python.org
Subject: Re: New assignmens ...

On 25/10/2021 01:46, Avi Gross via Python-list wrote:
> No, many things need not be as general as possible once you consider
> how much work it may take to develop code and how many bugs and
> oddities might be introduced and even how much it may slow the interpreter.
...
> I imagine you can create some fairly complex examples you can suggest
> should be handled for generality including some very indirect
> references created dynamically. The code to recognize any abstract use
> of symbols may not only slow down every operation of even the simplest
> type but generate all kinds of error messages nobody will understand,
> let alone translate into other languages properly! Right now, it is
> simpler. An error message can say that only certain simple usages are allowed.

I don't consider this a strong argument. Limiting the scope of the walrus operator will just force people to organize their code so that they will use a normal assignment. So the resulting code will not be faster, less complex, or generate fewer error messages, because the complexity of the assignment that is needed is still the same.

Or you force people to be "creative" as follows. Suppose I would like to write a loop as follows:

    while ((a, b) := next_couple(a, b))[1]:
        do needed calculations

What I can do is write it as follows:

    while [tmp := next_couple(a,b), a := tmp[0], b := tmp[1]][-1]:
        do needed calculations

I really don't see what is gained by "forcing" me to write the second code over the first.

--
Antoon Pardon
--
https://mail.python.org/mailman/listinfo/python-list