[Dino has a deliberately invalid email address so sending him anything privately is not an option.]
Dino, I would agree with you that for some purposes you do NOT need to dig deep into a language to get fairly routine things done. You can often borrow ideas and code from an online search and cobble together "a" solution that works well enough. Of course, it may suddenly fall apart. An example is code that assumes 4 specific files exist in the current directory, unconditionally reads them, does stuff, and eventually overwrites some other file with new data. OOPS, what does the program do if one or more of those files do not exist in the current directory under those exact names? Does it check that they exist and exit gracefully, or start throwing error messages as the code blindly proceeds, trying to change variables that were never created, and perhaps end with an emptied file and ruin lots of things?

Too many examples you get from the internet are a bit like that. They tell you how to do something specific but leave out details that are a good idea in many kinds of code. And some adjustments you make may break things and produce a result you trust but that is actually wrong.

So if you find something you don't like about the language, guess what: you have very little standing if you never learned much more than what it took to copy some code. You are likely to be told anything from "you need to do it another way" to "the language is doing exactly what it was designed to do" to "go back to school and get a proper education, and then the answer may be obvious." It truly does NOT matter what YOU think should happen unless you can point to documentation that says something else should have happened.

I will skip a lengthier reply, but I am thinking of how some languages use a boxing/unboxing approach to wrap native "objects" like an integer.
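Since I brought up that file example, here is roughly what the missing defensive check looks like in Python. The file names (and the helper name `missing_inputs`) are of course made up for illustration:

```python
from pathlib import Path
import sys

# Hypothetical required input files -- the real names would come from the task.
REQUIRED = ["alpha.csv", "beta.csv", "gamma.csv", "delta.csv"]

def missing_inputs(names, directory="."):
    """Return the expected file names that are NOT present in directory."""
    here = Path(directory)
    return [n for n in names if not (here / n).is_file()]

def main():
    missing = missing_inputs(REQUIRED)
    if missing:
        # Fail early and loudly instead of clobbering an output file later.
        sys.exit(f"Missing input file(s): {', '.join(missing)}")
    # ... only now is it safe to open and process the files ...
```

A few lines of checking up front beats discovering the problem after half the work is done.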
In many cases, your program is allowed to treat such a value as either a wrapped object or an unboxed one as needed, and the compiler or interpreter keeps making changes behind the scenes so you see it as needed. In a sense, Booleans are a restricted form of integer and can be viewed in one of several ways. Python goes many steps further and has a concept of "truthy" that maps practically anything to True or False.

The bottom line is not that Python or any language is wrong or right, great or horrible. It is that, by your own admission, you have not taken steps to understand more than you have to, and you are thus in a weak position to argue anything. Not because any of us are smarter in some sense, but because some of us do study more intensively and come ready to re-evaluate our ideas when we encounter others'. What Python does in the situation discussed is pretty much what an assortment of languages, including many C-based ones, do.

If it happens that your background is limited, fine. Many of us here have been exposed to the ideas in 5 or a dozen or literally hundreds of languages and variants, and we are well aware that some languages treat Booleans differently. But note that we are posting on this forum, so we generally don't find the way Python has chosen all that objectionable. We welcome well-intentioned discussions, and the question is not at all stupid. But our answers may not be seen as reasonable, and that can be of some concern. The answer remains that the language was designed this way and many are happy with the design.

Interestingly, I wonder if anyone has designed an alternate object type that can be used mostly in place of Booleans but which imposes changes and restrictions so that trying to add a Boolean to an integer, or vice versa, results in an error. Python is flexible enough to do that, and perhaps there already is a module out there ...
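To make that concrete: in Python, bool really is a subclass of int, and the "strict" Boolean mused about above can be sketched in a few lines. `StrictBool` is a hypothetical name I made up, not an existing module:

```python
# bool is literally a subclass of int, which is why True + 1 quietly works.
assert issubclass(bool, int)
assert isinstance(True, int)
assert True + 1 == 2

# "Truthy": practically anything maps to True or False.
assert bool("") is False and bool([0]) is True

# A hypothetical strict Boolean that refuses arithmetic with integers
# instead of silently coercing.
class StrictBool:
    def __init__(self, value):
        self.value = bool(value)

    def __bool__(self):
        # Still usable in if-statements and boolean contexts.
        return self.value

    def __add__(self, other):
        raise TypeError("StrictBool does not support arithmetic")

    __radd__ = __add__  # also reject int + StrictBool

    def __repr__(self):
        return f"StrictBool({self.value})"
```

With that, `if StrictBool(True): ...` behaves as expected, but `StrictBool(True) + 1` (or `1 + StrictBool(True)`) raises TypeError rather than producing 2.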
-----Original Message-----
From: Python-list <python-list-bounces+avi.e.gross=gmail....@python.org> On Behalf Of Dino
Sent: Thursday, January 26, 2023 9:26 AM
To: python-list@python.org
Subject: Re: RE: bool and int

Wow. That was quite a message and an interesting read. I am tempted to go deep and say what I agree and disagree with, but there are two issues: 1) time, and 2) I will soon be at a disadvantage discussing with people (you or others) who know more than me (which doesn't necessarily make them right, but they'll certainly have the upper hand in a discussion).

Personally, in the first part of my career I got into the habit of learning things fast, sometimes superficially I confess, and then getting stuff done, hopefully within time and budget. Not the recommended approach if you need to build software for a nuclear plant. An OK approach (within reason) if you build websites or custom solutions for this or that organization and the budget is what it is. After all, technology moves sooo fast, and what we learn in detail today is bound to be old and possibly useless 5 years down the road.

Also, I argue that there is value in having familiarity with lots of different technologies (front-end and back-end) and knowing (or at least having a sense of) how they can all be made to play together, with an appreciation of the different challenges and benefits that each domain offers. Anyway, everything is equivalent to a Turing machine, and AI will screw everyone, including programmers, eventually.

Thanks again and have a great day

Dino

On 1/25/2023 9:14 PM, avi.e.gr...@gmail.com wrote:
> Dino,
>
> There is no such thing as a "principle of least surprise," or if you
> insist there is, I can nominate many more such "rules," such as "the
> principle of get out of my way and let me do what I want!"
>
> Computer languages with too many rules are sometimes next to unusable
> in practical situations.
>
> I am neither defending nor attacking choices Python or other languages
> have made.
> I merely observe and agree to use languages carefully and
> as documented.
--
https://mail.python.org/mailman/listinfo/python-list