On Wed, Jun 16, 2021 at 12:44 PM Avi Gross via Python-list <python-list@python.org> wrote:
>
> Greg,
>
> My point was not to ASK what python does as much as to ask why it matters to
> anyone which way it does it. Using less space at absolutely no real expense
> is generally a plus. Having a compiler work too hard, or even ask the code
> to work too hard, is often a minus.
>
> If I initialized the tuples by calling f(5) and g(5) and so on, the compiler
> might not even be easily able to figure out that they all return the same
> thing. So should it make a run-time addition to the code so that after
> calculating the second, it should look around and if it matches the first,
> combine them? Again, I have seen languages where the implementation is to
> have exactly one copy of each unique string of characters. That can be
> useful but if it is done by creating some data structure and searching
> through it even when we have millions of unique strings, ...
"some data structure" being, in all probability, a hashtable, so that searching through it is fast :) Python lets you do this via sys.intern(), and CPython automatically does it for any strings that look like identifiers. Your analyses are both correct; but the compiler is generally allowed to work hard, because its work can be dumped out into a file for next time. Consider this code: def make_adder(n): def adder(values): return [v + n for v in values] return adder This has four distinct code blocks (module, make_adder, adder, and the list comp), all of which are represented as immutable code objects, and can be saved into the .pyc file. If the compiler has to do extra work, that's fine! And in fact, it does a LOT of constant folding and other optimizations, based on what it can know about immediately. Everything else (constructing function objects, constructing lists, etc) is done at run time, although sometimes it's worth considering "function definition time" as a separate phase of execution (since, in many Python modules, that all happens at the very start of execution). Optimizations done at that point are potentially more expensive, since they might be done multiple times - imagine calling make_adder in a loop. But since the code object can be used as-is, it's no trouble to keep all the optimizations :) ChrisA -- https://mail.python.org/mailman/listinfo/python-list